Chen, Ming-Kai; Menard, David H; Cheng, David W
2016-03-01
In pursuit of as-low-as-reasonably-achievable (ALARA) doses, this study investigated the minimal required radioactivity and corresponding imaging time for reliable semiquantification in PET/CT imaging. Using a phantom containing spheres of various diameters (3.4, 2.1, 1.5, 1.2, and 1.0 cm) filled with a fixed (18)F-FDG concentration of 165 kBq/mL and a background concentration of 23.3 kBq/mL, we performed PET/CT at multiple time points over 20 h of radioactive decay. The images were acquired for 10 min at a single bed position for each of 10 half-lives of decay using 3-dimensional list mode and were reconstructed into 1-, 2-, 3-, 4-, 5-, and 10-min acquisitions per bed position using an ordered-subsets expectation maximization algorithm with 24 subsets and 2 iterations and a 2-mm Gaussian filter. SUVmax and SUVavg were measured for each sphere. The minimal required activity (±10%) for precise SUVmax semiquantification in the spheres was 1.8 kBq/mL for an acquisition of 10 min, 3.7 kBq/mL for 3-5 min, 7.9 kBq/mL for 2 min, and 17.4 kBq/mL for 1 min. The minimal required activity concentration-acquisition time product per bed position was 10-15 kBq/mL⋅min for reproducible SUV measurements within the spheres without overestimation. Using the total radioactivity and counting rate from the entire phantom, we found that the minimal required total activity-time product was 17 MBq⋅min and the minimal required counting rate-time product was 100 kcps⋅min. Our phantom study determined a threshold for minimal radioactivity and acquisition time for precise semiquantification in (18)F-FDG PET imaging that can serve as a guide in pursuit of achieving ALARA doses. © 2016 by the Society of Nuclear Medicine and Molecular Imaging, Inc.
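As a rough worked example (not from the study), the activity-time threshold quoted above can be turned into a minimum acquisition time for a given decayed (18)F concentration. In the sketch below, the half-life is the physical constant for (18)F and the 15 kBq/mL⋅min threshold is the upper end of the reported range; the starting concentration and elapsed time are illustrative.

```python
import math

F18_HALF_LIFE_MIN = 109.77  # physical half-life of 18F in minutes

def decayed_concentration(c0_kbq_ml: float, elapsed_min: float) -> float:
    """Activity concentration after radioactive decay (kBq/mL)."""
    return c0_kbq_ml * math.exp(-math.log(2) * elapsed_min / F18_HALF_LIFE_MIN)

def min_acquisition_time(conc_kbq_ml: float, threshold_kbq_ml_min: float = 15.0) -> float:
    """Acquisition time (min per bed) needed to reach the activity-time product threshold."""
    return threshold_kbq_ml_min / conc_kbq_ml

# Example: sphere filled at 165 kBq/mL, imaged after ~5 half-lives of decay
c = decayed_concentration(165.0, 5 * F18_HALF_LIFE_MIN)
print(f"concentration ~{c:.1f} kBq/mL, need >= {min_acquisition_time(c):.1f} min per bed")
```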
DOE Office of Scientific and Technical Information (OSTI.GOV)
Filho, Faete J; Tolbert, Leon M; Ozpineci, Burak
2012-01-01
The work developed here proposes a methodology for calculating switching angles for varying DC sources in a multilevel cascaded H-bridge converter. In this approach the required fundamental is achieved, the lower harmonics are minimized, and the system can be implemented in real time with low memory requirements. A genetic algorithm (GA) is used as the stochastic search method to find the solution for the set of equations in which the input voltages are the known variables and the switching angles are the unknown variables. With the dataset generated by the GA, an artificial neural network (ANN) is trained to store the solutions without excessive memory storage requirements. This trained ANN then senses the voltage of each cell and produces the switching angles in order to regulate the fundamental at 120 V and eliminate or minimize the low-order harmonics while operating in real time.
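To make the set of equations concrete, the sketch below (my own illustration, not the paper's implementation) writes the staircase-waveform harmonic amplitudes of a cascaded H-bridge and searches for switching angles with a small genetic algorithm. The cell voltages, the interpretation of the 120 V target as an RMS fundamental, and all GA settings are assumptions.

```python
import math
import random

def harmonic(angles, vdc, h):
    """Peak amplitude of harmonic h of a cascaded H-bridge staircase waveform."""
    return 4.0 / (h * math.pi) * sum(v * math.cos(h * a) for v, a in zip(vdc, angles))

def fitness(angles, vdc, v_target):
    """Penalize fundamental error plus low-order (5th, 7th) harmonic content."""
    err = (harmonic(angles, vdc, 1) - v_target) ** 2
    return err + harmonic(angles, vdc, 5) ** 2 + harmonic(angles, vdc, 7) ** 2

def ga_switching_angles(vdc, v_target, pop=60, gens=300, seed=1):
    """Small GA: elitist selection, blend crossover, Gaussian mutation."""
    rng = random.Random(seed)
    n = len(vdc)
    popn = [sorted(rng.uniform(0, math.pi / 2) for _ in range(n)) for _ in range(pop)]
    for _ in range(gens):
        popn.sort(key=lambda a: fitness(a, vdc, v_target))
        elite = popn[: pop // 4]
        children = list(elite)
        while len(children) < pop:
            p1, p2 = rng.sample(elite, 2)
            child = [min(max(0.5 * (x + y) + rng.gauss(0, 0.02), 0.0), math.pi / 2 - 1e-3)
                     for x, y in zip(p1, p2)]
            children.append(sorted(child))
        popn = children
    return min(popn, key=lambda a: fitness(a, vdc, v_target))

# Example: three H-bridge cells with unequal DC sources (volts), 120 V RMS fundamental target
vdc = [62.0, 58.0, 55.0]
angles = ga_switching_angles(vdc, v_target=120.0 * math.sqrt(2))
print([round(math.degrees(a), 1) for a in angles],
      round(harmonic(angles, vdc, 1) / math.sqrt(2), 1), "V rms fundamental")
```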
Minimizing Input-to-Output Latency in Virtual Environment
NASA Technical Reports Server (NTRS)
Adelstein, Bernard D.; Ellis, Stephen R.; Hill, Michael I.
2009-01-01
A method and apparatus were developed to minimize latency (time delay) in virtual environment (VE) and other discrete-time computer-based systems that require real-time display in response to sensor inputs. Latency in such systems is due to the sum of the finite time required for information processing and communication within and between sensors, software, and displays.
New minimal access hydrocelectomy.
Saber, Aly
2011-02-01
To ascertain the acceptability of minimal access hydrocelectomy through a 2-cm incision and the outcome in terms of morbidity reduction and recurrence rate. Although controversy exists regarding the treatment of hydrocele, hydrocelectomy remains the treatment of choice for hydroceles. However, the standard surgical procedures for hydrocele can cause postoperative discomfort and complications. A total of 42 adult patients, aged 18-56 years, underwent hydrocelectomy as an outpatient procedure using a 2-cm scrotal skin incision and excision of only a small disk of the parietal tunica vaginalis. The operative time was 12-18 minutes (mean 15). The outcome measures included patient satisfaction and postoperative complications. This procedure requires minor dissection and minimal manipulation during treatment. It also resulted in no recurrence and minimal complications and required a short operative time. Copyright © 2011 Elsevier Inc. All rights reserved.
Minimization search method for data inversion
NASA Technical Reports Server (NTRS)
Fymat, A. L.
1975-01-01
A technique has been developed for determining values of selected subsets of independent variables in mathematical formulations. The required computation time increases with the first power of the number of variables. This is in contrast with classical minimization methods, for which computation time increases with the third power of the number of variables.
Almutairy, Meznah; Torng, Eric
2018-01-01
Bioinformatics applications and pipelines increasingly use k-mer indexes to search for similar sequences. The major problem with k-mer indexes is that they require lots of memory. Sampling is often used to reduce index size and query time. Most applications use one of two major types of sampling: fixed sampling and minimizer sampling. It is well known that fixed sampling will produce a smaller index, typically by roughly a factor of two, whereas it is generally assumed that minimizer sampling will produce faster query times since query k-mers can also be sampled. However, no direct comparison of fixed and minimizer sampling has been performed to verify these assumptions. We systematically compare fixed and minimizer sampling using the human genome as our database. We use the resulting k-mer indexes for fixed sampling and minimizer sampling to find all maximal exact matches between our database, the human genome, and three separate query sets, the mouse genome, the chimp genome, and an NGS data set. We reach the following conclusions. First, using larger k-mers reduces query time for both fixed sampling and minimizer sampling at a cost of requiring more space. If we use the same k-mer size for both methods, fixed sampling requires typically half as much space whereas minimizer sampling processes queries only slightly faster. If we are allowed to use any k-mer size for each method, then we can choose a k-mer size such that fixed sampling both uses less space and processes queries faster than minimizer sampling. The reason is that although minimizer sampling is able to sample query k-mers, the number of shared k-mer occurrences that must be processed is much larger for minimizer sampling than fixed sampling. In conclusion, we argue that for any application where each shared k-mer occurrence must be processed, fixed sampling is the right sampling method.
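A toy illustration of the two sampling schemes on a short DNA string may help; the function names and the lexicographic minimizer ordering below are my own choices, not the paper's code.

```python
def fixed_sampling(seq, k, w):
    """Index every w-th k-mer start position (fixed sampling)."""
    return {i: seq[i:i + k] for i in range(0, len(seq) - k + 1, w)}

def minimizer_sampling(seq, k, w):
    """Index the lexicographically smallest k-mer in every window of w consecutive k-mers."""
    picked = {}
    for win_start in range(len(seq) - k - w + 2):
        window = [(seq[i:i + k], i) for i in range(win_start, win_start + w)]
        kmer, pos = min(window)
        picked[pos] = kmer
    return picked

seq = "ACGTACGGTACGTTACG"
print("fixed    :", sorted(fixed_sampling(seq, k=5, w=3).items()))
print("minimizer:", sorted(minimizer_sampling(seq, k=5, w=3).items()))
```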
Minimizing inner product data dependencies in conjugate gradient iteration
NASA Technical Reports Server (NTRS)
Vanrosendale, J.
1983-01-01
The amount of concurrency available in conjugate gradient iteration is limited by the summations required in the inner product computations. The inner product of two vectors of length N requires time c log(N), if N or more processors are available. This paper describes an algebraic restructuring of the conjugate gradient algorithm which minimizes data dependencies due to inner product calculations. After an initial start-up, the new algorithm can perform a conjugate gradient iteration in time c log(log(N)).
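For reference, the textbook conjugate gradient iteration below marks the two inner products per iteration that act as global reductions; the paper's restructured variant is not reproduced here, this only shows where the data dependencies sit.

```python
import numpy as np

def conjugate_gradient(A, b, tol=1e-10, max_iter=200):
    """Textbook CG. The two dot-product reductions per iteration are the
    synchronization points the paper's restructuring seeks to minimize."""
    x = np.zeros_like(b)
    r = b - A @ x
    p = r.copy()
    rs_old = r @ r                    # inner product #1 (global reduction)
    for _ in range(max_iter):
        Ap = A @ p
        alpha = rs_old / (p @ Ap)     # inner product #2 (global reduction)
        x += alpha * p
        r -= alpha * Ap
        rs_new = r @ r                # inner product #1 of the next iteration
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs_old) * p
        rs_old = rs_new
    return x

A = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
print(conjugate_gradient(A, b))   # expect approx [0.0909, 0.6364]
```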
Optimal Black Start Resource Allocation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Qiu, Feng; Wang, Jianhui; Chen, Chen
The restoration of the bulk power system after a partial or complete blackout relies on black-start (BS) resources. To prepare for system restoration, it is important to procure the right amount of BS resources at the right locations in the grid so that the total restoration time can be minimized. Achieving this goal requires that resource procurement planning takes the restoration process into account. In this study, we integrate the BS resource procurement decision with a restoration planning model and develop an optimization model that produces a minimal cost procurement plan that satisfies the restoration time requirement.
Cash Management in the United States Marine Corps.
1984-12-01
procedures and requires that such departments and agencies conduct financial activities in a manner that will make cash holding requirements...balances so as to minimize the overall cost of holding cash" [Ref. 3: 2]. Simply stated, effective cash management implies the minimization of cash...balances held, as opposed to invested, as well as timely receipt and disbursement of government funds. B. ORGANIZATION RESPONSIBILITY FOR FINANCIAL
Application of quadratic optimization to supersonic inlet control.
NASA Technical Reports Server (NTRS)
Lehtinen, B.; Zeller, J. R.
1972-01-01
This paper describes the application of linear stochastic optimal control theory to the design of the control system for the air intake (the inlet) of a supersonic air-breathing propulsion system. The controls must maintain a stable inlet shock position in the presence of random airflow disturbances and prevent inlet unstart. Two different linear time-invariant controllers are developed. One is designed to minimize a nonquadratic index, the expected frequency of inlet unstart, and the other is designed to minimize the mean square value of inlet shock motion. The quadratic equivalence principle is used to obtain a linear controller that minimizes the nonquadratic index. The two controllers are compared on the basis of unstart prevention, control effort requirements, and frequency response. It is concluded that while controls designed to minimize unstarts are desirable in that the index minimized is physically meaningful, the computation time required is longer than for the minimum mean square shock position approach. The simpler minimum mean square shock position solution produced expected unstart frequency values which were not significantly larger than those of the nonquadratic solution.
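As a hedged stand-in for the minimum mean-square design (the nonquadratic unstart-frequency index is not reproduced), the sketch below solves a standard LQR problem for a hypothetical two-state inlet model; the matrices and weights are invented for illustration.

```python
import numpy as np
from scipy.linalg import solve_continuous_are

# Hypothetical linearized inlet model: states = [shock position, shock velocity]
A = np.array([[0.0, 1.0], [-4.0, -1.5]])
B = np.array([[0.0], [2.0]])

# Quadratic index: heavily weight mean-square shock motion, lightly weight control effort
Q = np.diag([10.0, 0.1])
R = np.array([[1.0]])

P = solve_continuous_are(A, B, Q, R)      # steady-state Riccati solution
K = np.linalg.solve(R, B.T @ P)           # optimal state-feedback gain, u = -K x
print("feedback gain K =", K)
print("closed-loop poles:", np.linalg.eigvals(A - B @ K))
```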
Adaptive Implicit Non-Equilibrium Radiation Diffusion
DOE Office of Scientific and Technical Information (OSTI.GOV)
Philip, Bobby; Wang, Zhen; Berrill, Mark A
2013-01-01
We describe methods for accurate and efficient long-term time integration of non-equilibrium radiation diffusion systems: implicit time integration for efficient long-term time integration of stiff multiphysics systems, local control theory based step size control to minimize the required global number of time steps while controlling accuracy, dynamic 3D adaptive mesh refinement (AMR) to minimize memory and computational costs, Jacobian-Free Newton-Krylov methods on AMR grids for efficient nonlinear solution, and optimal multilevel preconditioner components that provide level independent solver convergence.
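The step-size control idea can be illustrated with a much simpler integrator than the paper's JFNK/AMR machinery: a backward-Euler scheme whose step is adjusted from a step-doubling local error estimate. Everything below (the test problem, the tolerance, the proportional step controller) is an assumption, not the paper's controller.

```python
import math

def be_step(f, dfdy, t, y, h, newton_iters=8):
    """One backward-Euler step of y' = f(t, y), solved with Newton's method."""
    z = y
    for _ in range(newton_iters):
        g = z - y - h * f(t + h, z)
        z -= g / (1.0 - h * dfdy(t + h, z))
    return z

def integrate(f, dfdy, t0, y0, t_end, h=1e-3, tol=1e-6):
    """Adaptive backward Euler: the local error is estimated by comparing one full
    step with two half steps, and the step size is scaled to keep that estimate
    near tol (a simple stand-in for a control-theory step controller)."""
    t, y = t0, y0
    while t < t_end:
        h = min(h, t_end - t)
        big = be_step(f, dfdy, t, y, h)
        half = be_step(f, dfdy, t + h / 2, be_step(f, dfdy, t, y, h / 2), h / 2)
        err = abs(half - big) + 1e-30
        if err <= tol:
            t, y = t + h, half                       # accept the step
        h *= min(2.0, max(0.2, 0.9 * tol / err))     # grow or shrink the step
    return t, y

# Stiff test problem: y' = -50 (y - cos t), y(0) = 0
f = lambda t, y: -50.0 * (y - math.cos(t))
dfdy = lambda t, y: -50.0
print(integrate(f, dfdy, 0.0, 0.0, 2.0))
```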
1993-05-01
obtained to provide a nominal control history. The guidance law is found by minimizing the second variation of the suboptimal trajectory...deviations from the suboptimal trajectory to required changes in the nominal control history. The deviations from the suboptimal trajectory, used together...with the precomputed gains, determine the change in the nominal control history required to meet the final constraints while minimizing the change in
Wang, Ming-Cheng; Lin, Wei-Hung; Yan, Jing-Jou; Fang, Hsin-Yi; Kuo, Te-Hui; Tseng, Chin-Chung; Wu, Jiunn-Jong
2015-08-01
Matrix-assisted laser desorption/ionization-time of flight mass spectrometry (MALDI-TOF MS) is a valuable method for rapid identification of blood stream infection (BSI) pathogens. Integration of MALDI-TOF MS and blood culture systems can speed the identification of causative BSI microorganisms. We investigated the minimal microorganism concentrations of common BSI pathogens required for positive blood culture using BACTEC FX and for positive identification using MALDI-TOF MS. The time to detection with positive BACTEC FX and minimal incubation time with positive MALDI-TOF MS identification were determined for earlier identification of common BSI pathogens. The minimal microorganism concentrations required for positive blood culture using BACTEC FX were >10(7)-10(8) colony forming units/mL for most of the BSI pathogens. The minimal microorganism concentrations required for identification using MALDI-TOF MS were >10(7) colony forming units/mL. Using simulated BSI models, one can obtain enough bacterial concentration from blood culture bottles for successful identification of five common Gram-positive and Gram-negative bacteria using MALDI-TOF MS 1.7-2.3 hours earlier than the usual time to detection in blood culture systems. This study provides an approach to earlier identification of BSI pathogens prior to the detection of a positive signal in the blood culture system using MALDI-TOF MS, compared to current methods. It can speed the time for identification of BSI pathogens and may benefit earlier therapy choice and patient outcome. Copyright © 2013. Published by Elsevier B.V.
OxMaR: open source free software for online minimization and randomization for clinical trials.
O'Callaghan, Christopher A
2014-01-01
Minimization is a valuable method for allocating participants between the control and experimental arms of clinical studies. The use of minimization reduces differences that might arise by chance between the study arms in the distribution of patient characteristics such as gender, ethnicity and age. However, unlike randomization, minimization requires real time assessment of each new participant with respect to the preceding distribution of relevant participant characteristics within the different arms of the study. For multi-site studies, this necessitates centralized computational analysis that is shared between all study locations. Unfortunately, there is no suitable freely available open source or free software that can be used for this purpose. OxMaR was developed to enable researchers in any location to use minimization for patient allocation and to access the minimization algorithm using any device that can connect to the internet such as a desktop computer, tablet or mobile phone. The software is complete in itself and requires no special packages or libraries to be installed. It is simple to set up and run over the internet using online facilities which are very low cost or even free to the user. Importantly, it provides real time information on allocation to the study lead or administrator and generates real time distributed backups with each allocation. OxMaR can readily be modified and customised and can also be used for standard randomization. It has been extensively tested and has been used successfully in a low budget multi-centre study. Hitherto, the logistical difficulties involved in minimization have precluded its use in many small studies and this software should allow more widespread use of minimization which should lead to studies with better matched control and experimental arms. OxMaR should be particularly valuable in low resource settings.
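OxMaR's exact algorithm and weighting options are not reproduced here; the sketch below is a generic Pocock-Simon-style minimization step of the kind the abstract describes, with made-up stratification factors and a probabilistic element to limit predictability.

```python
import random

def minimization_assign(new_patient, allocations, factors, arms=("control", "experimental"),
                        p_best=0.8, rng=random.Random(0)):
    """Assign the arm that minimizes total marginal imbalance across the factors.
    With probability p_best take that arm, otherwise pick at random
    (the random element guards against predictability)."""
    def imbalance_if(arm):
        total = 0
        for f in factors:
            level = new_patient[f]
            counts = {a: sum(1 for pt, a2 in allocations if a2 == a and pt[f] == level)
                      for a in arms}
            counts[arm] += 1                              # pretend we assigned this arm
            total += max(counts.values()) - min(counts.values())
        return total

    scores = {a: imbalance_if(a) for a in arms}
    best = min(scores, key=scores.get)
    chosen = best if rng.random() < p_best else rng.choice(arms)
    allocations.append((new_patient, chosen))
    return chosen

allocations = []
factors = ["sex", "age_group"]
for pt in [{"sex": "F", "age_group": "old"}, {"sex": "M", "age_group": "old"},
           {"sex": "F", "age_group": "young"}, {"sex": "F", "age_group": "old"}]:
    print(pt, "->", minimization_assign(pt, allocations, factors))
```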
Challenges and solutions for realistic room simulation
NASA Astrophysics Data System (ADS)
Begault, Durand R.
2002-05-01
Virtual room acoustic simulation (auralization) techniques have traditionally focused on answering questions related to speech intelligibility or musical quality, typically in large volumetric spaces. More recently, auralization techniques have been found to be important for the externalization of headphone-reproduced virtual acoustic images. Although externalization can be accomplished using a minimal simulation, data indicate that realistic auralizations need to be responsive to head motion cues for accurate localization. Computational demands increase when providing for the simulation of coupled spaces, small rooms lacking meaningful reverberant decays, or reflective surfaces in outdoor environments. Auditory threshold data for both early reflections and late reverberant energy levels indicate that much of the information captured in acoustical measurements is inaudible, minimizing the intensive computational requirements of real-time auralization systems. Results are presented for early reflection thresholds as a function of azimuth angle, arrival time, and sound-source type, and reverberation thresholds as a function of reverberation time and level within 250-Hz-2-kHz octave bands. Good agreement is found between data obtained in virtual room simulations and those obtained in real rooms, allowing a strategy for minimizing computational requirements of real-time auralization systems.
Autonomous Object Characterization with Large Datasets
2015-10-18
desk, where a substantial amount of effort is required to transform raw photometry into a data product, minimizing the amount of time the analyst has...were used to explore concepts in satellite characterization and satellite state change. The first algorithm provides real-time stability estimation... Timely and effective space object (SO) characterization is a challenge, and requires advanced data processing techniques. Detection and identification
Assignment Of Finite Elements To Parallel Processors
NASA Technical Reports Server (NTRS)
Salama, Moktar A.; Flower, Jon W.; Otto, Steve W.
1990-01-01
Elements assigned approximately optimally to subdomains. Mapping algorithm based on simulated-annealing concept used to minimize approximate time required to perform finite-element computation on hypercube computer or other network of parallel data processors. Mapping algorithm needed when shape of domain complicated or otherwise not obvious what allocation of elements to subdomains minimizes cost of computation.
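A minimal sketch of the simulated-annealing mapping idea follows; the cost function (load imbalance plus cut edges), the cooling schedule, and the toy ring mesh are illustrative assumptions, not the reported algorithm's details.

```python
import math
import random

def anneal_mapping(n_elems, edges, n_procs, steps=20000, t0=2.0, seed=0):
    """Assign finite elements to processors by simulated annealing.
    Cost = load imbalance + communication (edges cut between processors)."""
    rng = random.Random(seed)
    assign = [rng.randrange(n_procs) for _ in range(n_elems)]

    def cost(a):
        loads = [a.count(p) for p in range(n_procs)]
        cut = sum(1 for u, v in edges if a[u] != a[v])
        return (max(loads) - min(loads)) + cut

    c = cost(assign)
    for step in range(steps):
        t = t0 * (1.0 - step / steps) + 1e-6              # linear cooling schedule
        e, p = rng.randrange(n_elems), rng.randrange(n_procs)
        old = assign[e]
        assign[e] = p
        c_new = cost(assign)
        if c_new <= c or rng.random() < math.exp((c - c_new) / t):
            c = c_new                                     # accept the move
        else:
            assign[e] = old                               # reject, restore
    return assign, c

# Toy mesh: 8 elements in a ring, mapped onto 2 processors
edges = [(i, (i + 1) % 8) for i in range(8)]
print(anneal_mapping(8, edges, 2))
```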
Reconnaissance and Autonomy for Small Robots (RASR) team: MAGIC 2010 challenge
NASA Astrophysics Data System (ADS)
Lacaze, Alberto; Murphy, Karl; Del Giorno, Mark; Corley, Katrina
2012-06-01
The Reconnaissance and Autonomy for Small Robots (RASR) team developed a system for the coordination of groups of unmanned ground vehicles (UGVs) that can execute a variety of military relevant missions in dynamic urban environments. Historically, UGV operations have been primarily performed via tele-operation, requiring at least one dedicated operator per robot, and requiring substantial real-time bandwidth to accomplish those missions. Our team goal was to develop a system that can provide long-term value to the war-fighter, utilizing MAGIC-2010 as a stepping stone. To that end, we self-imposed a set of constraints that would force us to develop technology that could readily be used by the military in the near term: • Use a relevant (deployed) platform • Use low-cost, reliable sensors • Develop an expandable and modular control system with innovative software algorithms to minimize the computing footprint required • Minimize required communications bandwidth and handle communication losses • Minimize additional power requirements to maximize battery life and mission duration
Work intensity in sacroiliac joint fusion and lumbar microdiscectomy
Frank, Clay; Kondrashov, Dimitriy; Meyer, S Craig; Dix, Gary; Lorio, Morgan; Kovalsky, Don; Cher, Daniel
2016-01-01
Background: The evidence base supporting minimally invasive sacroiliac (SI) joint fusion (SIJF) surgery is increasing. The work relative value units (RVUs) associated with minimally invasive SIJF are seemingly low. To date, only one published study describes the relative work intensity associated with minimally invasive SIJF. No study has compared work intensity vs other commonly performed spine surgery procedures. Methods: Charts of 192 patients at five sites who underwent either minimally invasive SIJF (American Medical Association [AMA] CPT® code 27279) or lumbar microdiscectomy (AMA CPT® code 63030) were reviewed. Abstracted were preoperative times associated with diagnosis and patient care, intraoperative parameters including operating room (OR) in/out times and procedure start/stop times, and postoperative care requirements. Additionally, using a visual analog scale, surgeons estimated the intensity of intraoperative care, including mental, temporal, and physical demands and effort and frustration. Work was defined as operative time multiplied by task intensity. Results: Patients who underwent minimally invasive SIJF were more likely female. Mean procedure times were lower in SIJF by about 27.8 minutes (P<0.0001) and mean total OR times were lower by 27.9 minutes (P<0.0001), but there was substantial overlap across procedures. Mean preservice and post-service total labor times were longer in minimally invasive SIJF (preservice times longer by 63.5 minutes [P<0.0001] and post-service labor times longer by 20.2 minutes [P<0.0001]). The number of postoperative visits was higher in minimally invasive SIJF. Mean total service time (preoperative + OR time + postoperative) was higher in the minimally invasive SIJF group (261.5 vs 211.9 minutes, P<0.0001). Intraoperative intensity levels were higher for mental, physical, effort, and frustration domains (P<0.0001 each). After taking into account intensity, intraoperative workloads showed substantial overlap. Conclusion: Compared to a commonly performed lumbar spine surgical procedure, lumbar microdiscectomy, that currently has a higher work RVU, preoperative, intraoperative, and postoperative workload for minimally invasive SIJF is higher. The work RVU for minimally invasive SIJF should be adjusted upward as the relative amount of work is comparable. PMID:27555790
ERIC Educational Resources Information Center
Schreyer-Bennethum, Lynn; Albright, Leonard
2011-01-01
We report the qualitative and quantitative results of incorporating interdisciplinary application projects and increasing the use of teaching with technology into Calculus I, II and III at the University of Colorado Denver. Minimal changes were made to the curriculum and minimal time was required of instructors to make the changes. Instructors…
System requirements for head down and helmet mounted displays in the military avionics environment
DOE Office of Scientific and Technical Information (OSTI.GOV)
Flynn, M.F.; Kalmanash, M.; Sethna, V.
1996-12-31
The introduction of flat panel display technologies into the military avionics cockpit is a challenging proposition, due to the very difficult system-level requirements which must be met. These relate to environmental extremes (temperature and vibration), severe ambient lighting conditions (10,000 fL to nighttime viewing), night vision system compatibility, and wide viewing angle. At the same time, the display system must be packaged in minimal space and use minimal power. The authors will present details on the display system requirements for both head down and helmet mounted systems, as well as information on how these challenges may be overcome.
On-patient see-through augmented reality based on visual SLAM.
Mahmoud, Nader; Grasa, Óscar G; Nicolau, Stéphane A; Doignon, Christophe; Soler, Luc; Marescaux, Jacques; Montiel, J M M
2017-01-01
An augmented reality system to visualize a 3D preoperative anatomical model on the intra-operative patient is proposed. The hardware requirement is a commercial tablet-PC equipped with a camera. Thus, neither an external tracking device nor artificial landmarks on the patient are required. We resort to visual SLAM to provide markerless real-time tablet-PC camera location with respect to the patient. The preoperative model is registered with respect to the patient through 4-6 anchor points. The anchors correspond to anatomical references selected on the tablet-PC screen at the beginning of the procedure. Accurate and real-time preoperative model alignment (approximately 5-mm mean FRE and TRE) was achieved, even when anchors were not visible in the current field of view. The system has been experimentally validated on human volunteers, in vivo pigs and a phantom. The proposed system can be smoothly integrated into the surgical workflow because it: (1) operates in real time, (2) requires minimal additional hardware (only a tablet-PC with a camera), (3) is robust to occlusion, (4) requires minimal interaction from the medical staff.
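The abstract does not state how the 4-6 anchor points are used internally; as one plausible sketch, a least-squares rigid fit (Kabsch/Umeyama) aligns the preoperative model to the selected anchors. The points and transform below are synthetic.

```python
import numpy as np

def rigid_register(model_pts, anchor_pts):
    """Least-squares rigid transform (R, t) mapping model points onto anchors (Kabsch)."""
    mu_m, mu_a = model_pts.mean(axis=0), anchor_pts.mean(axis=0)
    H = (model_pts - mu_m).T @ (anchor_pts - mu_a)
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))          # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = mu_a - R @ mu_m
    return R, t

# Hypothetical anchors: 5 anatomical references selected on the tablet screen (mm)
model = np.array([[0, 0, 0], [50, 0, 0], [0, 40, 0], [0, 0, 30], [25, 20, 10]], float)
true_R = np.array([[0, -1, 0], [1, 0, 0], [0, 0, 1]], float)
anchors = model @ true_R.T + np.array([10.0, -5.0, 2.0])
R, t = rigid_register(model, anchors)
fre = np.linalg.norm(model @ R.T + t - anchors, axis=1).mean()
print("mean fiducial registration error (mm):", round(fre, 6))
```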
System analysis for technology transfer readiness assessment of horticultural postharvest
NASA Astrophysics Data System (ADS)
Hayuningtyas, M.; Djatna, T.
2018-04-01
Postharvest technologies are becoming abundant, but only a few are applicable and useful to the wider community. This problem calls for a technology-transfer approach based on an assessment of readiness level. The proposed system reliably assesses the readiness of a technology on a scale of 1-9 and minimizes the transfer time at every level, so that the time required from the selection process onward can be kept to a minimum. The problem was solved by using the Relief method to rank postharvest technologies at each level by weighting feasible criteria, and PERT (Program Evaluation Review Technique) for scheduling. The ranking results show that the horticultural postharvest technology considered is able to pass level 7; the technology can therefore be developed to pilot scale, and the time required to reach technological readiness is minimized, with a PERT optimistic time of 7.9 years. Readiness level 9 indicates that the technology has been tested under actual conditions and is tied to an estimated production price compared with competitors. The system can be used to determine the readiness of technology innovations derived from agricultural raw materials as they pass through the defined stages.
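A small sketch of the PERT estimate used for scheduling follows; the activity names, durations, and the assumption that the readiness-level activities are sequential are mine, not the paper's data.

```python
def pert_expected(optimistic, most_likely, pessimistic):
    """Classical PERT beta approximation of an activity's expected duration."""
    return (optimistic + 4 * most_likely + pessimistic) / 6.0

# Hypothetical sequential readiness-level activities (years): (optimistic, likely, pessimistic)
activities = {
    "TRL 7 pilot scale-up": (1.0, 1.5, 2.5),
    "TRL 8 field validation": (2.0, 3.0, 4.5),
    "TRL 9 production trial": (2.5, 3.5, 5.0),
}
total = sum(pert_expected(*d) for d in activities.values())
print(f"expected time to full readiness: {total:.1f} years")
```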
Minimizing damage to a propped fracture by controlled flowback procedures
DOE Office of Scientific and Technical Information (OSTI.GOV)
Robinson, B.M.; Holditch, S.A.; Whitehead, W.S.
1988-06-01
Severe fracture-conductivity damage can result from proppant crushing and/or proppant flowback into the wellbore. Such damage is often concentrated near the wellbore and can directly affect postfracture performance. Most of the time severe fracture-conductivity damage can be minimized by choosing the correct type of proppant for a particular well. In many cases, however, this is not enough. To minimize excessive crushing or to prevent proppant flowback, it is also necessary to control carefully the flowback of the well after the treatment. Specific procedures can be followed to minimize severe fracture-conductivity damage. These procedures involve controlling the rates at which load fluids are recovered and maximizing backpressure against the formation. These procedures require much more time and effort than is normally spent on postfracture cleanup; however, the efforts could result in better performance.
Lairmore, Terry C; Folek, Jessica; Govednik, Cara M; Snyder, Samuel K
2016-07-01
Minimally invasive adrenalectomy is commonly performed by either a transperitoneal laparoscopic (TLA) or posterior retroperitoneoscopic (PRA) approach. Our group described the technique for robot-assisted PRA (RAPRA) in 2010. Few studies are available that directly compare outcomes between the available operative approaches. We reviewed our results for minimally invasive adrenalectomy using the three different approaches over a 10-year period. Between January 2005 and April 2015, 160 minimally invasive adrenalectomies were performed. Clinicopathologic data were prospectively collected and retrospectively analyzed. The primary endpoints evaluated were operative time, blood loss, length of stay (LOS), and morbidity. The study included 67 TLA, 76 PRA, and 17 RAPRA procedures. Tumor size for PRA/RAPRA was smaller than for patients undergoing TLA (2.38 vs 3.6 cm, p ≤ 0.0001). Procedure time was shorter for PRA versus TLA (133.3 vs 152.8 min, p = 0.0381), as was LOS (1.85 vs 2.82 days, p = 0.0145). Procedure time was longer in RAPRA versus TLA/PRA (177 vs 153/133 min, p = 0.008), but LOS was significantly decreased (1.53 vs 2.82/1.85 days, p = 0.004). Minimally invasive adrenalectomy is associated with expected excellent outcomes regardless of approach. In our series, the posterior approach is associated with decreased operative time and LOS. Robotic technology provides potential advantages for the surgeon at the expense of more complex setup requirements and costs. Further study is required to demonstrate clear benefit of one surgical approach. Utilization of the entire spectrum of available operative techniques can allow for selection of the optimal approach based on individual patient factors.
Parametric study of minimum converter loss in an energy-storage dc-to-dc converter
NASA Technical Reports Server (NTRS)
Wong, R. C.; Owen, H. A., Jr.; Wilson, T. G.
1982-01-01
Through a combination of analytical and numerical minimization procedures, a converter design that results in the minimum total converter loss (including core loss, winding loss, capacitor and energy-storage-reactor loss, and various losses in the semiconductor switches) is obtained. Because the initial phase involves analytical minimization, the computation time required by the subsequent phase of numerical minimization is considerably reduced in this combination approach. The effects of various loss parameters on the optimum values of the design variables are also examined.
A minimal multiconfigurational technique.
Fernández Rico, J; Paniagua, M; García de la Vega, J M; Fernández-Alonso, J I; Fantucci, P
1986-04-01
A direct minimization method previously presented by the authors is applied here to biconfigurational wave functions. A very moderate increase in the time per iteration with respect to the one-determinant calculation and good convergence properties have been found. Thus, qualitatively correct studies on singlet systems with strong biradical character can be performed with a cost similar to that required by Hartree-Fock calculations. Copyright © 1986 John Wiley & Sons, Inc.
40 CFR 60.4333 - What are my general requirements for complying with this subpart?
Code of Federal Regulations, 2013 CFR
2013-07-01
... stationary combustion turbine, air pollution control equipment, and monitoring equipment in a manner consistent with good air pollution control practices for minimizing emissions at all times including during... of Performance for Stationary Combustion Turbines General Compliance Requirements § 60.4333 What are...
40 CFR 60.4333 - What are my general requirements for complying with this subpart?
Code of Federal Regulations, 2012 CFR
2012-07-01
... stationary combustion turbine, air pollution control equipment, and monitoring equipment in a manner consistent with good air pollution control practices for minimizing emissions at all times including during... of Performance for Stationary Combustion Turbines General Compliance Requirements § 60.4333 What are...
40 CFR 60.4333 - What are my general requirements for complying with this subpart?
Code of Federal Regulations, 2014 CFR
2014-07-01
... stationary combustion turbine, air pollution control equipment, and monitoring equipment in a manner consistent with good air pollution control practices for minimizing emissions at all times including during... of Performance for Stationary Combustion Turbines General Compliance Requirements § 60.4333 What are...
Application-oriented offloading in heterogeneous networks for mobile cloud computing
NASA Astrophysics Data System (ADS)
Tseng, Fan-Hsun; Cho, Hsin-Hung; Chang, Kai-Di; Li, Jheng-Cong; Shih, Timothy K.
2018-04-01
Nowadays, Internet applications have become so complicated that a mobile device needs more computing resources for shorter execution time, but it is restricted by limited battery capacity. Mobile cloud computing (MCC) has emerged to tackle the finite-resource problem of mobile devices. MCC offloads the tasks and jobs of mobile devices to cloud and fog environments by using an offloading scheme. It is vital to MCC to decide which tasks should be offloaded and how to offload them efficiently. In this paper, we formulate the offloading problem between the mobile device and the cloud data center and propose two application-oriented algorithms for minimum execution time, i.e. the Minimum Offloading Time for Mobile device (MOTM) algorithm and the Minimum Execution Time for Cloud data center (METC) algorithm. The MOTM algorithm minimizes offloading time by selecting appropriate offloading links based on application categories. The METC algorithm minimizes execution time in the cloud data center by selecting virtual and physical machines with the corresponding resource requirements of applications. Simulation results show that the proposed mechanism not only minimizes total execution time for mobile devices but also decreases their energy consumption.
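A simplified stand-in for the offload-or-not decision is sketched below: compare local execution time against transfer plus remote execution time over each candidate link and take the minimum. Task size, link parameters, and device/cloud speeds are invented; this is not the MOTM/METC formulation itself.

```python
from dataclasses import dataclass

@dataclass
class Link:
    name: str
    uplink_mbps: float     # offloading bandwidth
    rtt_s: float           # round-trip latency

def local_time(cycles, device_speed_hz):
    return cycles / device_speed_hz

def offload_time(input_bits, cycles, link, cloud_speed_hz):
    """Transfer time over the chosen link plus remote execution time."""
    return input_bits / (link.uplink_mbps * 1e6) + link.rtt_s + cycles / cloud_speed_hz

# Hypothetical task: 4e9 CPU cycles, 8 MB of input data
cycles, input_bits = 4e9, 8 * 8e6
device, cloud = 1.2e9, 16e9          # device and cloud CPU speeds in Hz
links = [Link("WiFi", 40.0, 0.02), Link("LTE", 10.0, 0.05)]

options = {"local": local_time(cycles, device)}
options.update({l.name: offload_time(input_bits, cycles, l, cloud) for l in links})
best = min(options, key=options.get)
print({k: round(v, 2) for k, v in options.items()}, "-> choose", best)
```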
Minimum Control Requirements for Advanced Life Support Systems
NASA Technical Reports Server (NTRS)
Boulange, Richard; Jones, Harry; Jones, Harry
2002-01-01
Advanced control technologies are not necessary for the safe, reliable and continuous operation of Advanced Life Support (ALS) systems. ALS systems can and are adequately controlled by simple, reliable, low-level methodologies and algorithms. The automation provided by advanced control technologies is claimed to decrease system mass and necessary crew time by reducing buffer size and minimizing crew involvement. In truth, these approaches increase control system complexity without clearly demonstrating an increase in reliability across the ALS system. Unless these systems are as reliable as the hardware they control, there is no savings to be had. A baseline ALS system is presented with the minimal control system required for its continuous safe reliable operation. This baseline control system uses simple algorithms and scheduling methodologies and relies on human intervention only in the event of failure of the redundant backup equipment. This ALS system architecture is designed for reliable operation, with minimal components and minimal control system complexity. The fundamental design precept followed is "If it isn't there, it can't fail".
Polyhedral Interpolation for Optimal Reaction Control System Jet Selection
NASA Technical Reports Server (NTRS)
Gefert, Leon P.; Wright, Theodore
2014-01-01
An efficient algorithm is described for interpolating optimal values for spacecraft Reaction Control System jet firing duty cycles. The algorithm uses the symmetrical geometry of the optimal solution to reduce the number of calculations and data storage requirements to a level that enables implementation on the small real time flight control systems used in spacecraft. The process minimizes acceleration direction errors, maximizes control authority, and minimizes fuel consumption.
Code of Federal Regulations, 2013 CFR
2013-07-01
... stationary CI RICE 1 a. Change oil and filter every 500 hours of operation or annually, whichever comes first.2 b. Inspect air cleaner every 1,000 hours of operation or annually, whichever comes first, and... comes first, and replace as necessary.3 Minimize the engine's time spent at idle and minimize the engine...
Code of Federal Regulations, 2014 CFR
2014-07-01
... stationary CI RICE 1 a. Change oil and filter every 500 hours of operation or annually, whichever comes first.2 b. Inspect air cleaner every 1,000 hours of operation or annually, whichever comes first, and... comes first, and replace as necessary.3 Minimize the engine's time spent at idle and minimize the engine...
Iodine addition using triiodide solutions
NASA Technical Reports Server (NTRS)
Rutz, Jeffrey A.; Muckle, Susan V.; Sauer, Richard L.
1992-01-01
The study develops a triiodide solution for use in preparing ground service equipment (GSE) water for Shuttle support, an iodine dissolution method that is reliable and requires minimal time and effort to prepare, and an iodine dissolution agent with a minimal concentration of sodium salt. Sodium iodide and hydriodic acid were both found to dissolve iodine to attain the desired GSE iodine concentrations of 7.5 ± 2.5 mg/L and 25 ± 5 mg/L. The 1.75:1 and 2:1 sodium iodide solutions produced higher iodine recoveries than the 1.2:1 hydriodic acid solution. A two-hour preparation time is required for the three sodium iodide solutions. The 1.2:1 hydriodic acid solution can be prepared in less than 5 min. Two sodium iodide stock solutions (2.5:1 and 2:1) were found to dissolve iodine without undergoing precipitation.
Multi-objective group scheduling optimization integrated with preventive maintenance
NASA Astrophysics Data System (ADS)
Liao, Wenzhu; Zhang, Xiufang; Jiang, Min
2017-11-01
This article proposes a single-machine-based integration model to meet the requirements of production scheduling and preventive maintenance in group production. To describe the production for identical/similar and different jobs, this integrated model considers the learning and forgetting effects. Based on machine degradation, the deterioration effect is also considered. Moreover, perfect maintenance and minimal repair are adopted in this integrated model. The multi-objective of minimizing total completion time and maintenance cost is taken to meet the dual requirements of delivery date and cost. Finally, a genetic algorithm is developed to solve this optimization model, and the computation results demonstrate that this integrated model is effective and reliable.
Bio-inspired secure data mules for medical sensor network
NASA Astrophysics Data System (ADS)
Muraleedharan, Rajani; Gao, Weihua; Osadciw, Lisa A.
2010-04-01
Medical sensor networks consist of heterogeneous nodes, wireless, mobile and wired, with varied functionality. The resources at each sensor must be used minimally while sensitive information is sensed and communicated to its access points using secure data mules. In this paper, we analyze the flat architecture, where differing functionality and priority of information require varied resources, forming a non-deterministic polynomial-time hard problem. Hence, a bio-inspired data mule that helps to obtain a dynamic multi-objective solution with minimal resources and a secure path is applied. The performance of the proposed approach is based on reduced latency, data delivery rate and resource cost.
30 CFR 717.17 - Protection of the hydrologic system.
Code of Federal Regulations, 2010 CFR
2010-07-01
..., and minimizing water contact time with waste materials, maintaining mine barriers to enhance... series of sedimentation ponds prior to leaving the permit area. All waters which flow or are removed from... requirements of this paragraph may be satisfied by submitting to the regulatory authority on the same time...
Costs and benefits of different methods of esophagectomy for esophageal cancer.
Yanasoot, Alongkorn; Yolsuriyanwong, Kamtorn; Ruangsin, Sakchai; Laohawiriyakamol, Supparerk; Sunpaweravong, Somkiat
2017-01-01
Background: A minimally invasive approach to esophagectomy is being used increasingly, but concerns remain regarding the feasibility, safety, cost, and outcomes. We performed an analysis of the costs and benefits of minimally invasive, hybrid, and open esophagectomy approaches for esophageal cancer surgery. Methods: The data of 83 consecutive patients who underwent a McKeown's esophagectomy at Prince of Songkla University Hospital between January 2008 and December 2014 were analyzed. Open esophagectomy was performed in 54 patients, minimally invasive esophagectomy in 13, and hybrid esophagectomy in 16. There were no differences in patient characteristics among the 3 groups. Minimally invasive esophagectomy was undertaken via a thoracoscopic-laparoscopic approach, hybrid esophagectomy via a thoracoscopic-laparotomy approach, and open esophagectomy by a thoracotomy-laparotomy approach. Results: Minimally invasive esophagectomy required a longer operative time than hybrid or open esophagectomy (p = 0.02), but these patients reported less postoperative pain (p = 0.01). There were no significant differences in blood loss, intensive care unit stay, hospital stay, or postoperative complications among the 3 groups. Minimally invasive esophagectomy incurred higher operative and surgical material costs than hybrid or open esophagectomy (p = 0.01), but there were no significant differences in inpatient care and total hospital costs. Conclusion: Minimally invasive esophagectomy resulted in the least postoperative pain but the greatest operative cost and longest operative time. Open esophagectomy was associated with the lowest operative cost and shortest operative time but the most postoperative pain. Hybrid esophagectomy had a shorter learning curve while sharing the advantages of minimally invasive esophagectomy.
The impact of preventable disruption on the operative time for minimally invasive surgery.
Al-Hakim, Latif
2011-10-01
Current ergonomic studies show that disruption exposes surgical teams to stress and musculoskeletal disorders. This study considers minimally invasive surgery as a sociotechnical process subjected to a variety of disruption events other than those recognized by ergonomic science. The research takes into consideration the impact of preventable disruption on operating time rather than on the physical and emotional status of the surgical team. Events inside operating rooms that disturbed operative time were recorded for 17 minimally invasive surgeries. The disruption events were classified into four main areas: prerequisite requirements, work design, communication during surgery, and other. Each area was further classified according to sources of disruption. Altogether, 11 sources of disruption were identified: patient record, protocol and policy, surgical requirements and surgeon preferences, operating table and patient positioning, arrangement of instruments, lighting, monitor, clothing, surgical teamwork, coordination, and other. Disruption prolonged operative time by more than 32%. Teamwork forms the main source of disruption followed by operating table and patient positioning and arrangement of instruments. These three sources represented approximately 20% of operative time. Failure to follow principles of work design had a significant negative impact, lengthening operative time by approximately 15%. Although lighting and monitors had a relatively small impact on operative time, these factors could create inconvenience and stress within the surgical teams. In addition, the effect of failure to follow surgical protocols and policies or having incomplete patient records may have a limited effect on operative time but could have serious consequences. This report demonstrates that preventable disruption caused an increase in operative time and forced surgeons and patients to endure unnecessary delay of more than 32%. Such additional time could be used to deal with the pressure of emergency cases and to reduce waiting lists for elective surgery.
Private practice outcomes: validated outcomes data collection in private practice.
Goldstein, Jack
2010-10-01
Improved patient care is related to validated outcome measures requiring the collection of three distinct data types: (1) demographics; (2) patient outcome measures; and (3) physician treatment. Previous impediments to widespread data collection have been: cost, office disruption, personnel requirements, physician motivation, data integration, and security. There are currently few means to collect data to be used for collaborative analysis. We therefore developed an inexpensive, patient-centric mechanism to reduce redundant data entry, limiting cost and personnel requirements. Using an intuitive touch-screen kiosk interface program, all data elements have been captured in a private practice setting since 2000. Developed for small to medium sized offices, this is scalable to larger organizations. Questionnaire navigation is patient driven, with demographics shared with EMR and billing systems. Integration of billing and EMR with outcomes minimizes cost and personnel time. Data are deidentified locally and may be centrally shared. Since data are entered by the patients, minimal personnel costs are incurred. Physician disincentives are minimized with cost reduction, time savings and ease of use. To date, we have collected high level data on most total joint patients, with excellent patient compliance. By addressing impediments to broad application, we may enable widespread local data collection in all practice settings. Data may be shared centrally, allowing comparative effectiveness research to become a reality. Future success will require broad physician participation, uniformity of data collected, and designation of a central site for receipt of data and its collaborative comparative analysis.
Economic impact of minimally invasive lumbar surgery.
Hofstetter, Christoph P; Hofer, Anna S; Wang, Michael Y
2015-03-18
Cost effectiveness has been demonstrated for traditional lumbar discectomy, lumbar laminectomy as well as for instrumented and noninstrumented arthrodesis. While emerging evidence suggests that minimally invasive spine surgery reduces morbidity, duration of hospitalization, and accelerates return to activities of daily living, data regarding cost effectiveness of these novel techniques is limited. The current study analyzes all available data on minimally invasive techniques for lumbar discectomy, decompression, short-segment fusion and deformity surgery. In general, minimally invasive spine procedures appear to hold promise in quicker patient recovery times and earlier return to work. Thus, minimally invasive lumbar spine surgery appears to have the potential to be a cost-effective intervention. Moreover, novel less invasive procedures are less destabilizing and may therefore be utilized in certain indications that traditionally required arthrodesis procedures. However, there is a lack of studies analyzing the economic impact of minimally invasive spine surgery. Future studies are necessary to confirm the durability and further define indications for minimally invasive lumbar spine procedures.
Choosing colors for map display icons using models of visual search.
Shive, Joshua; Francis, Gregory
2013-04-01
We show how to choose colors for icons on maps to minimize search time using predictions of a model of visual search. The model analyzes digital images of a search target (an icon on a map) and a search display (the map containing the icon) and predicts search time as a function of target-distractor color distinctiveness and target eccentricity. We parameterized the model using data from a visual search task and performed a series of optimization tasks to test the model's ability to choose colors for icons to minimize search time across icons. Map display designs made by this procedure were tested experimentally. In a follow-up experiment, we examined the model's flexibility to assign colors in novel search situations. The model fits human performance, performs well on the optimization tasks, and can choose colors for icons on maps with novel stimuli to minimize search time without requiring additional model parameter fitting. Models of visual search can suggest color choices that produce search time reductions for display icons. Designers should consider constructing visual search models as a low-cost method of evaluating color assignments.
DQM: Decentralized Quadratically Approximated Alternating Direction Method of Multipliers
NASA Astrophysics Data System (ADS)
Mokhtari, Aryan; Shi, Wei; Ling, Qing; Ribeiro, Alejandro
2016-10-01
This paper considers decentralized consensus optimization problems where nodes of a network have access to different summands of a global objective function. Nodes cooperate to minimize the global objective by exchanging information with neighbors only. A decentralized version of the alternating direction method of multipliers (DADMM) is a common method for solving this category of problems. DADMM exhibits a linear convergence rate to the optimal objective but its implementation requires solving a convex optimization problem at each iteration. This can be computationally costly and may result in large overall convergence times. The decentralized quadratically approximated ADMM algorithm (DQM), which minimizes a quadratic approximation of the objective function that DADMM minimizes at each iteration, is proposed here. The consequent reduction in computational time is shown to have minimal effect on convergence properties. Convergence still proceeds at a linear rate with a guaranteed constant that is asymptotically equivalent to the DADMM linear convergence rate constant. Numerical results demonstrate advantages of DQM relative to DADMM and other alternatives in a logistic regression problem.
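As a hedged sketch of the idea in generic notation (not the paper's exact update or dual bookkeeping), the contrast between the two primal steps can be written as follows: the ADMM-style step requires an exact inner minimization over the local objective, while the quadratic approximation reduces it to one linear solve per iteration.

```latex
% Schematic DADMM-style primal step at node i (exact inner minimization over the
% local objective f_i; \phi_i^k collects dual terms, c is the penalty, N_i the neighbors):
x_i^{k+1} = \arg\min_{x}\; f_i(x) + x^{\top}\phi_i^{k}
  + c \sum_{j \in \mathcal{N}_i} \Bigl\| x - \tfrac{1}{2}\bigl(x_i^{k} + x_j^{k}\bigr) \Bigr\|^{2}

% DQM idea: replace f_i by its second-order Taylor model at x_i^k, so the step
% collapses to one linear solve per iteration instead of an inner optimization:
\bigl( \nabla^{2} f_i(x_i^{k}) + 2c\,|\mathcal{N}_i|\, I \bigr)\bigl(x_i^{k+1} - x_i^{k}\bigr)
  = -\nabla f_i(x_i^{k}) - \phi_i^{k} - c \sum_{j \in \mathcal{N}_i} \bigl( x_i^{k} - x_j^{k} \bigr)
```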
14 CFR 1214.403 - Code of Conduct for the International Space Station Crew.
Code of Federal Regulations, 2010 CFR
2010-01-01
... responsible for directing the mission. A Flight Director will be in charge of directing real-time ISS... advance of the mission and are designed to minimize the amount of real-time discussion required during... and impose disciplinary measures. (4) “ETOV” means Earth-to-Orbit Vehicle travelling between Earth and...
Federal Register 2010, 2011, 2012, 2013, 2014
2010-12-03
... worker to obtain and post information for hoists. Total Burden Hours: 20,957. Estimated Cost (Operation... information is in the desired format, reporting burden (time and costs) is minimal, collection instruments are... accuracy of OSHA's estimate of the burden (time and costs) of the information collection requirements...
Dwell time algorithm based on the optimization theory for magnetorheological finishing
NASA Astrophysics Data System (ADS)
Zhang, Yunfei; Wang, Yang; Wang, Yajun; He, Jianguo; Ji, Fang; Huang, Wen
2010-10-01
Magnetorheological finishing (MRF) is an advanced polishing technique capable of rapidly converging to the required surface figure. This process can deterministically control the amount of material removed by varying the time spent dwelling at each particular position on the workpiece surface. The dwell time algorithm is one of the key techniques of MRF. A dwell time algorithm based on the matrix equation and optimization theory is presented in this paper. The conventional mathematical model of the dwell time was transferred to a matrix equation containing the initial surface error, the removal function, and the dwell time function. The dwell time to be calculated is just the solution to this large, sparse matrix equation. A new mathematical model of the dwell time based on optimization theory was established, which aims to minimize the 2-norm or ∞-norm of the residual surface error. The solution meets almost all the requirements of precise computer numerical control (CNC) without any need for extra data processing, because this optimization model takes some polishing conditions as constraints. Practical approaches to finding a minimal least-squares solution and a minimal maximum solution are also discussed in this paper. Simulations have shown that the proposed algorithm is numerically robust and reliable. With this algorithm an experiment has been performed on the MRF machine developed by ourselves. After 4.7 minutes' polishing, the figure error of a flat workpiece with a 50 mm diameter is improved in PV from 0.191λ (λ = 632.8 nm) to 0.087λ and in RMS from 0.041λ to 0.010λ. This algorithm can be applied to polish workpieces of all shapes including flats, spheres, aspheres, and prisms, and it is capable of improving the polishing figures dramatically.
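The paper's constrained least-squares and minimax formulations are not reproduced here; the sketch below solves the matrix form A t ≈ e for nonnegative dwell times with a hypothetical Gaussian removal function, using SciPy's non-negative least squares.

```python
import numpy as np
from scipy.optimize import nnls

def dwell_times(error, positions, sigma=2.0):
    """Solve A t = e for nonnegative dwell times, where column j of A is a
    (Gaussian-shaped, hypothetical) removal function centered at position j."""
    x = np.arange(len(error), dtype=float)
    A = np.exp(-0.5 * ((x[:, None] - positions[None, :]) / sigma) ** 2)
    t, residual = nnls(A, error)
    return t, residual

# Hypothetical 1-D surface error profile (nm) and candidate tool positions
error = 50.0 + 20.0 * np.sin(np.linspace(0, 3 * np.pi, 60))
positions = np.arange(0.0, 60.0, 2.0)
t, res = dwell_times(error, positions)
print("total dwell (arb. units):", round(t.sum(), 1), " residual 2-norm:", round(res, 2))
```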
Gurdon, J B; Fairman, S; Mohun, T J; Brennan, S
1985-07-01
Muscle gene expression is induced a few hours after vegetal cells of a Xenopus blastula are placed in contact with animal cells that normally develop into epidermis and nerve cells. We have used a muscle-specific actin gene probe to determine the timing of gene activation in animal-vegetal conjugates. Muscle actin RNA is first transcribed in a minority of animal cells at a stage equivalent to late gastrula. The time of muscle gene activation is determined by the developmental stage of the responding (animal) cells, and not by the time when cells are first placed in contact. The minimal cell contact time required for induction is between 1.5 and 2.5 hr, and the minimal time for gene activation after induction is 5-7 hr.
Optimal boarding method for airline passengers
DOE Office of Scientific and Technical Information (OSTI.GOV)
Steffen, Jason H.; /Fermilab
2008-02-01
Using a Markov Chain Monte Carlo optimization algorithm and a computer simulation, I find the passenger ordering which minimizes the time required to board the passengers onto an airplane. The model that I employ assumes that the time that a passenger requires to load his or her luggage is the dominant contribution to the time needed to completely fill the aircraft. The optimal boarding strategy may reduce the time required to board an airplane by over a factor of four and possibly more depending upon the dimensions of the aircraft. I explore some features of the optimal boarding method and discuss practical modifications to the optimal. Finally, I mention some of the benefits that could come from implementing an improved passenger boarding scheme.
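A toy version of the approach is sketched below: a crude cell-based boarding simulation in which luggage stowing blocks the aisle, plus a greedy random-swap search over boarding orders as a stand-in for the paper's Markov chain Monte Carlo optimization. All parameters (rows, stow time, step counts) are illustrative.

```python
import random

def boarding_time(order, stow_time=3):
    """Crude cell-based aisle model: each tick a passenger advances one row if the
    cell ahead is free, then blocks the aisle for `stow_time` ticks at their row."""
    queue = list(order)          # destination row numbers; front of the list boards first
    aisle = {}                   # aisle position -> [destination row, remaining stow ticks]
    t = 0
    while queue or aisle:
        t += 1
        for pos in sorted(aisle, reverse=True):      # move front-most passengers first
            row, stow = aisle[pos]
            if pos == row:                           # at their row: stow, then sit
                if stow > 1:
                    aisle[pos][1] = stow - 1
                else:
                    del aisle[pos]
            elif pos + 1 not in aisle:               # walk forward if not blocked
                aisle[pos + 1] = aisle.pop(pos)
        if queue and 0 not in aisle:                 # next passenger enters at the door
            aisle[0] = [queue.pop(0), stow_time]
    return t

def improve_order(rows=20, per_row=3, steps=1500, seed=0):
    """Greedy random-swap search over boarding orders, a crude stand-in for the
    paper's Markov chain Monte Carlo optimization."""
    rng = random.Random(seed)
    order = [r for r in range(1, rows + 1) for _ in range(per_row)]
    rng.shuffle(order)
    cur = boarding_time(order)
    for _ in range(steps):
        i, j = rng.randrange(len(order)), rng.randrange(len(order))
        order[i], order[j] = order[j], order[i]
        new = boarding_time(order)
        if new <= cur:
            cur = new
        else:
            order[i], order[j] = order[j], order[i]  # undo a worsening swap
    return cur

back_to_front = boarding_time([r for r in range(20, 0, -1) for _ in range(3)])
print("back-to-front ticks:", back_to_front, " after swap search:", improve_order())
```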
NASA Astrophysics Data System (ADS)
Hakim Halim, Abdul; Ernawati; Hidayat, Nita P. A.
2018-03-01
This paper deals with a model of batch scheduling for a single batch processor on which a number of parts of a single item are to be processed. The process needs two kinds of setups, i.e., main setups required before processing any batches, and additional setups required repeatedly after the batch processor completes a certain number of batches. The parts to be processed arrive at the shop floor at times coinciding with their respective starting times of processing, and the completed parts are to be delivered at multiple due dates. The objective adopted for the model is that of minimizing total inventory holding cost, consisting of the holding cost per unit time for a part in completed batches and that in in-process batches. The formulation of total inventory holding cost is derived from the so-called actual flow time, defined as the interval between the arrival times of parts at the production line and the delivery times of the completed parts. The actual flow time satisfies not only minimum inventory but also just-in-time arrival and delivery. An algorithm to solve the model is proposed and a numerical example is shown.
Rapid rehabilitation and recovery with minimally invasive total hip arthroplasty.
Berger, Richard A; Jacobs, Joshua J; Meneghini, R Michael; Della Valle, Craig; Paprosky, Wayne; Rosenberg, Aaron G
2004-12-01
To assess the potential recovery rate of a minimally invasive total hip replacement technique with minimal soft tissue disruption, an accelerated rehabilitation protocol was implemented with weightbearing as tolerated on the day of surgery. One hundred consecutive patients were enrolled in this prospective study. Ninety-seven patients (97%) met all the inpatient physical therapy goals required for discharge to home on the day of surgery; 100% of patients achieved these goals within 23 hours of surgery. Outpatient therapy was initiated in 9% of patients immediately, 62% of patients by 1 week, and all patients by 2 weeks. The mean time to discontinued use of crutches, discontinued use of narcotic pain medications, and resumed driving was 6 days postoperatively. The mean time to return to work was 8 days, discontinued use of any assistive device was 9 days, and resumption of all activities of daily living was 10 days. The mean time to walk 1/2 mile was 16 days. Furthermore, there were no readmissions, no dislocations, and no reoperations. Therefore, a rapid rehabilitation protocol is safe and fulfills the potential benefits of a rapid recovery with minimally invasive total hip arthroplasty.
Minimizing EVA Airlock Time and Depress Gas Losses
NASA Technical Reports Server (NTRS)
Trevino, Luis A.; Lafuse, Sharon A.
2008-01-01
This paper describes the need and solution for minimizing EVA airlock time and depress gas losses using a new method that minimizes EVA out-the-door time for a suited astronaut and reclaims most of the airlock depress gas. This method consists of one or more related concepts that use an evacuated reservoir tank to store and reclaim the airlock depress gas. The evacuated tank can be an inflatable tank, a spent fuel tank from a lunar lander descent stage, or a backup airlock. During EVA airlock operations, the airlock and reservoir would be equalized at some low pressure, and through proper selection of reservoir size, most of the depress gas would be stored in the reservoir for later reclamation. The benefit of this method is directly applicable to long-duration lunar and Mars missions that require multiple EVAs (up to 100 two-person lunar EVAs) and conservation of consumables, including depress pump power and depress gas. The current ISS airlock gas reclamation method requires approximately 45 minutes of the astronaut's time in the airlock and 1 kW of electrical power. The proposed method would decrease the astronaut's time in the airlock because the depress gas is temporarily stored in a reservoir tank for later recovery. Once the EVA crew is conducting the EVA, the volume in the reservoir would be pumped back to the cabin at a slow rate. Various trades were conducted to optimize this method, including time to equalize the airlock with the evacuated reservoir versus reservoir size, pump power to reclaim depress gas versus time allotted, inflatable reservoir pros and cons (weight, volume, complexity), and feasibility of spent lunar nitrogen and oxygen tanks as reservoirs.
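The reservoir-sizing trade can be illustrated with a back-of-the-envelope isothermal ideal-gas estimate of a single airlock-to-reservoir equalization; the volumes and pressure below are invented for the example and are not figures from the paper.

```python
def equalization(p_airlock_kpa, v_airlock_m3, v_reservoir_m3):
    """Isothermal ideal-gas estimate of equalizing an airlock with an
    initially evacuated reservoir.

    Returns the equalized pressure and the fraction of the airlock gas that
    ends up stored in the reservoir (available for later reclamation); the
    remainder is what would be vented when the hatch is opened."""
    p_eq = p_airlock_kpa * v_airlock_m3 / (v_airlock_m3 + v_reservoir_m3)
    stored_fraction = v_reservoir_m3 / (v_airlock_m3 + v_reservoir_m3)
    return p_eq, stored_fraction

# Example: a 10 m^3 airlock at 101.3 kPa with a 40 m^3 evacuated reservoir
p_eq, stored = equalization(101.3, 10.0, 40.0)
print(f"equalized pressure: {p_eq:.1f} kPa, gas stored: {stored:.0%}")
```

The larger the reservoir relative to the airlock, the lower the residual pressure at hatch opening and the larger the fraction of depress gas that can later be pumped back to the cabin.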
Dissociation between Complete Hippocampal Context Memory Formation and Context Fear Acquisition
ERIC Educational Resources Information Center
Leake, Jessica; Zinn, Raphael; Corbit, Laura; Vissel, Bryce
2017-01-01
Rodents require a minimal time period to explore a context prior to footshock to display plateau-level context fear at test. To investigate whether this rapid fear plateau reflects complete memory formation within that short time-frame, we used the immediate-early gene product Arc as an indicator of hippocampal context memory formation-related…
Connolly, M K; Cooper, C E
2014-12-01
Metabolic rate and evaporative water loss are two commonly measured physiological variables. It is therefore important, especially for comparative studies, that these variables (and others) are measured under standardised conditions, of which a resting state during the inactive phase is part of the accepted criteria. Here we show how measurement duration and timing affect these criteria and influence the estimation of basal metabolic rate (oxygen consumption and carbon dioxide production) and standard evaporative water loss of a small nocturnal rodent. Oxygen consumption, carbon dioxide production and evaporative water loss all decreased over the duration of an experiment. Random assortment of hourly values indicated that this was an animal rather than a random effect for up to 11 h. Experimental start time also had a significant effect on measurement of physiological variables. A longer time period was required to achieve minimal carbon dioxide production and evaporative water loss when experiments commenced earlier in the day; however, experiments with earlier start times had lower overall estimates of minimal oxygen consumption and carbon dioxide production. For this species, a measurement duration of at least 8 h, ideally commencing before the inactive phase, between 03:00 h and 05:00 h, is required to obtain minimal standard values for physiological variables. Up to 80% of recently published studies measuring basal metabolic rate and/or evaporative water loss of small nocturnal mammals may overestimate basal values due to insufficiently long measurement duration. Copyright © 2014 Elsevier Inc. All rights reserved.
Cognitive radio adaptation for power consumption minimization using biogeography-based optimization
NASA Astrophysics Data System (ADS)
Qi, Pei-Han; Zheng, Shi-Lian; Yang, Xiao-Niu; Zhao, Zhi-Jin
2016-12-01
Adaptation is one of the key capabilities of cognitive radio, which focuses on how to adjust the radio parameters to optimize the system performance based on the knowledge of the radio environment and its capability and characteristics. In this paper, we consider the cognitive radio adaptation problem for power consumption minimization. The problem is formulated as a constrained power consumption minimization problem, and the biogeography-based optimization (BBO) is introduced to solve this optimization problem. A novel habitat suitability index (HSI) evaluation mechanism is proposed, in which both the power consumption minimization objective and the quality of services (QoS) constraints are taken into account. The results show that under different QoS requirement settings corresponding to different types of services, the algorithm can minimize power consumption while still maintaining the QoS requirements. Comparison with particle swarm optimization (PSO) and cat swarm optimization (CSO) reveals that BBO works better, especially at the early stage of the search, which means that the BBO is a better choice for real-time applications. Project supported by the National Natural Science Foundation of China (Grant No. 61501356), the Fundamental Research Funds of the Ministry of Education, China (Grant No. JB160101), and the Postdoctoral Fund of Shaanxi Province, China.
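A generic biogeography-based optimization skeleton of the sort the abstract describes, where the habitat suitability index folds the power objective and a QoS constraint penalty together. The power and QoS functions, migration/mutation rates, and parameter bounds below are toy assumptions, not the paper's radio model.

```python
import random

def power(x):
    """Toy 'power consumption' objective: sum of squared radio parameters."""
    return sum(v * v for v in x)

def qos_penalty(x):
    """Toy QoS constraint: require sum(x) >= 2.0, quadratic penalty if violated."""
    return max(0.0, 2.0 - sum(x)) ** 2

def hsi(x):
    """Habitat suitability index: objective plus penalized constraint (lower is better)."""
    return power(x) + 100.0 * qos_penalty(x)

def bbo(dim=4, pop=20, gens=200, pmut=0.05, seed=1):
    rng = random.Random(seed)
    habitats = [[rng.uniform(0.0, 2.0) for _ in range(dim)] for _ in range(pop)]
    for _ in range(gens):
        habitats.sort(key=hsi)                    # best habitats first
        new = [h[:] for h in habitats]
        for i in range(1, pop):                   # keep the best habitat (elitism)
            immigration = (i + 1) / pop           # worse habitats immigrate more
            for d in range(dim):
                if rng.random() < immigration:
                    # emigration: copy the feature from a (probably better) habitat
                    j = min(int(rng.random() ** 2 * pop), pop - 1)
                    new[i][d] = habitats[j][d]
                if rng.random() < pmut:           # mutation
                    new[i][d] = rng.uniform(0.0, 2.0)
        habitats = new
    return min(habitats, key=hsi)

best = bbo()
print("best parameters:", [round(v, 3) for v in best], " HSI:", round(hsi(best), 4))
```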
76 FR 9559 - Proposed Information Collection; Comment Request
Federal Register 2010, 2011, 2012, 2013, 2014
2011-02-18
..., reporting burden (time and financial resources) is minimized, collection instruments are clearly understood... appropriate data for the Corporation's required performance measurement and other reporting. DATES: Written... proper performance of the functions of the Corporation, including whether the information will have...
Code of Federal Regulations, 2014 CFR
2014-07-01
... and Human Services, Payment Management System, P.O. Box 6021, Rockville, MD 20852. Interest amounts up... FOREIGN GOVERNMENTS, AND INTERNATIONAL ORGANIZATIONS Post-Award Requirements Financial and Program Management § 95.22 Payment. (a) Payment methods shall minimize the time elapsing between the transfer of...
Future Visions for Scientific Human Exploration
NASA Technical Reports Server (NTRS)
Garvin, James
2005-01-01
Today, humans explore deep-space locations such as Mars, asteroids, and beyond vicariously, here on Earth, with noteworthy success. However, achieving the revolutionary breakthroughs that have punctuated the history of science since the dawn of the Space Age has always required humans as "the discoverers," as Daniel Boorstin contends in his book of the same name. During Apollo 17, human explorers on the lunar surface discovered the "genesis rock" and orange glass, and humans in space revamped the optically crippled Hubble Space Telescope to enable some of the greatest astronomical discoveries of all time. Science-driven human exploration is about developing the opportunities for such events, perhaps associated with challenging problems such as whether we can identify life beyond Earth within the universe. At issue, however, is how to safely insert humans and the spaceflight systems required to allow humans to operate as they do best in the hostile environment of deep space. The first issue is minimizing the problems associated with human adaptation to the most challenging aspects of deep space: space radiation and microgravity (or non-Earth gravity). One solution path is to develop technologies that allow for minimization of the exposure time of people to deep space, as was accomplished in Apollo. For a mission to the planet Mars, this might entail new technological solutions for in-space propulsion that would make possible time-minimized transfers to and from Mars. The problem of rapid, reliable in-space transportation is challenged by the celestial mechanics of moving in space and the so-called "rocket equation." To travel to Mars from Earth in less time than fuel-minimizing trajectories (i.e., Hohmann transfers) allow requires an exponential increase in the amount of fuel. Thus, month-long transits would require a mass of fuel as large as the dry mass of the ISS, assuming the existence of continuous-acceleration engines. This raises the largest technological stumbling block to moving humans on site as deep-space explorers: delivering the masses required for human spaceflight systems to LEO or other Earth orbital vantage points using the existing or projected fleet of Earth-to-orbit (ETO) launch vehicles. Without a return to Saturn V-class boosters or an alternate path, one cannot imagine emplacing the masses that would be required for any deep-space voyage without a prohibitive number of Shuttle-class launches. One futurist solution might involve mass launch systems that could be used to move the consumables, including fuel, water, food, and building materials, to LEO in pieces rather than launching integrated systems. This approach would necessitate the development of robotic assembly and fuel-storage systems in Earth orbit, but could provide for a natural separation of low-value cargo (e.g., fuel, water).
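The fuel penalty the rocket equation imposes can be made concrete with the standard Tsiolkovsky relation; the stage mass, delta-v, and specific impulses in this sketch are illustrative numbers, not figures from the article.

```python
import math

def propellant_mass(dry_mass_kg, delta_v_ms, isp_s, g0=9.80665):
    """Tsiolkovsky rocket equation: propellant needed to give dry_mass_kg a
    velocity change delta_v_ms with specific impulse isp_s (impulsive burn)."""
    v_e = isp_s * g0
    return dry_mass_kg * (math.exp(delta_v_ms / v_e) - 1.0)

# Illustrative comparison: a 50 t dry stage with a chemical engine (Isp ~450 s)
# versus a nuclear-thermal engine (Isp ~900 s) for a 6 km/s class delta-v.
for isp in (450, 900):
    m_p = propellant_mass(50_000, 6_000, isp)
    print(f"Isp {isp:3d} s -> propellant ~{m_p / 1000:.0f} t")
```

Because the propellant mass grows exponentially with delta-v, shortening the transfer (raising the required delta-v) quickly drives the launch-mass problem described in the text.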
Kirks, Russell C; Sola, Richard; Iannitti, David A; Martinie, John B; Vrochides, Dionisios
2016-01-01
Pancreatic and peripancreatic fluid collections may develop after severe acute pancreatitis. Organized fluid collections such as pancreatic pseudocyst and walled-off pancreatic necrosis (WOPN) that mature over time may require intervention to treat obstructive or constitutional symptoms related to the size and location of the collection as well as possible infection. Endoscopic, open surgical and minimally invasive techniques are described to treat post-inflammatory pancreatic fluid collections. Surgical intervention may be required to treat collections containing necrotic pancreatic parenchyma or in locations not immediately apposed to the stomach or duodenum. Comprising a blend of the surgical approach and the clinical benefits of minimally invasive surgery, the robot-assisted technique of pancreatic cystgastrostomy with pancreatic debridement is described.
Minimal two-sphere model of the generation of fluid flow at low Reynolds numbers.
Leoni, M; Bassetti, B; Kotar, J; Cicuta, P; Cosentino Lagomarsino, M
2010-03-01
Locomotion and generation of flow at low Reynolds number are subject to severe limitations due to the irrelevance of inertia: the "scallop theorem" requires that the system have at least two degrees of freedom, which move in non-reciprocal fashion, i.e. breaking time-reversal symmetry. We show here that a minimal model consisting of just two spheres driven by harmonic potentials is capable of generating flow. In this pump system the two degrees of freedom are the mean and relative positions of the two spheres. We have performed and compared analytical predictions, numerical simulation and experiments, showing that a time-reversible drive is sufficient to induce flow.
Code of Federal Regulations, 2012 CFR
2012-10-01
... and disbursement by the recipient, and (2) Financial management systems that meet the standards for... remitted annually to Department of Health and Human Services, Payment Management System, Rockville, MD... Requirements Financial and Program Management § 2543.22 Payment. (a) Payment methods shall minimize the time...
Code of Federal Regulations, 2014 CFR
2014-10-01
... and disbursement by the recipient, and (2) Financial management systems that meet the standards for... remitted annually to Department of Health and Human Services, Payment Management System, Rockville, MD... Requirements Financial and Program Management § 2543.22 Payment. (a) Payment methods shall minimize the time...
Code of Federal Regulations, 2011 CFR
2011-10-01
... and disbursement by the recipient, and (2) Financial management systems that meet the standards for... remitted annually to Department of Health and Human Services, Payment Management System, Rockville, MD... Requirements Financial and Program Management § 2543.22 Payment. (a) Payment methods shall minimize the time...
Real-time combustion monitoring of PCDD/F indicators by REMPI-TOFMS
Analyses for polychlorinated dibenzodioxin and dibenzofuran (PCDD/F) emissions typically require a 4 h extractive sample taken on an annual or less frequent basis. This results in a potentially minimally representative monitoring scheme. More recently, methods for continual sampl...
ERIC Educational Resources Information Center
Anderson, Barry D.
Little is known about the costs of setting up and implementing legislated minimal competency testing (MCT). To estimate the financial obstacles which lie between the idea and its implementation, MCT requirements are viewed from two perspectives. The first, government regulation, views legislated minimal competency requirements as an attempt by the…
Use of minimally invasive spine surgical instruments for the treatment of bone tumors.
Reeves, Russell A; DeWolf, Matthew C; Shaughnessy, Peter J; Ames, James B; Henderson, Eric R
2017-11-01
Orthopedic oncologists often encounter patients with minor bony lesions that are difficult to access surgically and therefore require large exposures out of proportion to the severity of disease that confer significant patient morbidity. Minimally invasive surgical techniques offer the advantage of smaller incisions, shorter operative times, decreased tissue damage, and decreased costs. A variety of surgical procedures have emerged using minimally invasive technologies, particularly in the field of spine surgery. Areas covered: In this article, we describe the Minimal Exposure Tubular Retractor (METRx™) System, which is a minimally invasive surgical device that utilizes a series of dilators to permit access to a surgical site of interest. This system was developed for use in treatment of disc herniation, spinal stenosis, posterior lumbar interbody fusion, transforaminal lumbar interbody fusion and spinal cord stimulation implantation. We also describe novel uses of this system for minimally invasive biopsy and treatment of benign and metastatic bone lesions at our institution. Expert commentary: Minimally invasive surgical techniques will continue to expand into the field of orthopedic oncology. With a greater number of studies proving the safety and effectiveness of this technique, the demand for minimally invasive treatments will grow.
Data forwarding mechanism for supporting real-time services during relocations in UMTS systems
NASA Astrophysics Data System (ADS)
Cai, Wei; Liao, Xianglong; Zheng, Liang; Liu, Zehong
2004-04-01
Minimizing the interruption during handovers or relocations caused by subscriber movement is a critical factor in enhancing the performance of UMTS systems. The 2G systems have been optimized to minimize the interruption of speech during handovers by two main technologies: bi-casting of the DL traffic, and fast radio resynchronization by the UE for the UL traffic. In the UMTS systems, lossless relocations have also been implemented for non-real-time services with high reliability by data buffering in the source RNC and target RNC for the UE. However, the UMTS systems support four QoS classes of traffic flow: conversational class, streaming class, interactive class and background class. The main distinguishing factor between these QoS classes is how delay-sensitive the traffic is: conversational and streaming classes are mainly used to carry real-time traffic flows, like video telephony, while interactive and background classes are mainly used by traditional Internet applications like WWW, E-mail and FTP. It is essential to provide solutions for supporting real-time services that meet the QoS requirements in UMTS systems. The data buffering mechanism is clearly not suited to real-time services, because its delay may exceed the basic requirements for real-time services. Against this background, the paper discusses two data forwarding solutions for real-time services from the PS domain in the UMTS systems: packet duplication and Core Network bi-casting. The former mechanism does not require any new procedures, messages or information elements. The latter mechanism requires that the GGSN or SGSN be able to bi-cast the DL traffic to the target RNC for relocations involving two SGSNs or just one SGSN; it also implies that, if the latter solution is adopted, the procedures at the SGSN, GGSN and RNC nodes involved in the relocation must be changed relative to the procedures already designed. In detail, the paper analyzes the characteristics of these two solutions, concentrating on the packet flows and the message flows in the nodes involved in relocations, and also gives the impact on present transport technologies in wireless communication systems. The impact of the evolution of the transport mechanism should, however, be minimized and resources utilized efficiently according to the general QoS requirements in UMTS systems.
Further Automate Planned Cluster Maintenance to Minimize System Downtime during Maintenance Windows
DOE Office of Scientific and Technical Information (OSTI.GOV)
Springmeyer, R.
This report documents the integration and testing of the automated update process of compute clusters in LC to minimize impact to user productivity. Description: A set of scripts will be written and deployed to further standardize cluster maintenance activities and minimize downtime during planned maintenance windows. Completion Criteria: When the scripts have been deployed and used during planned maintenance windows and a timing comparison is completed between the existing process and the new, more automated process, this milestone is complete. This milestone was completed on Aug 23, 2016 on the new CTS1 cluster called Jade when a request to upgrade the version of TOSS 3 was initiated while SWL jobs and normal user jobs were running. Jobs that were running when the update to the system began continued to run to completion. New jobs on the cluster started on the new release of TOSS 3. No system administrator action was required. Current update procedures in TOSS 2 begin by killing all user jobs. Then all diskfull nodes are updated, which can take a few hours. Only after the updates are applied are all nodes rebooted and finally put back into service. A system administrator is required for all steps. In terms of human time spent during a cluster OS update, the TOSS 3 automated procedure on Jade took 0 FTE hours. Doing the same update without the Toss Update Tool would have required 4 FTE hours.
77 FR 59354 - Removal of 30-Day Residency Requirement for Per Diem Payments
Federal Register 2010, 2011, 2012, 2013, 2014
2012-09-27
... when a veteran travels to visit family members. This proposed rule would also clarify in 38 CFR 51.43... 30 days is a minimal amount of time for demonstrating that a veteran intends to be a resident at the... specific period of time, or communicates that he or she will not be returning. With both types of absences...
The Ordering Challenge: An Online Game to Introduce Independent Demand Inventory Concepts
ERIC Educational Resources Information Center
Meyer, Brad C.; Bishop, Debra S.
2011-01-01
Students are put in the role of a manager who watches inventory levels decrease and must order at the right time and in the right quantity to minimize costs. This interactive game requires the students to race against time and has levels of increasing difficulty. It introduces the students to the concepts of holding cost, ordering cost, backlog…
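The holding-cost/ordering-cost trade-off the game introduces is the one behind the classical economic order quantity (EOQ); a brief sketch with invented numbers follows (the game itself is not described as using this formula).

```python
import math

def eoq(annual_demand, order_cost, holding_cost_per_unit_year):
    """Classical economic order quantity: the order size that minimizes the
    sum of annual ordering cost and annual holding cost."""
    return math.sqrt(2.0 * annual_demand * order_cost / holding_cost_per_unit_year)

def annual_cost(q, annual_demand, order_cost, holding_cost_per_unit_year):
    """Annual ordering cost plus annual holding cost for order size q."""
    return (annual_demand / q) * order_cost + (q / 2.0) * holding_cost_per_unit_year

D, K, h = 1200.0, 50.0, 2.0     # illustrative demand/yr, $/order, $/unit/yr
q_star = eoq(D, K, h)
print(f"EOQ = {q_star:.0f} units, cost = ${annual_cost(q_star, D, K, h):.2f}/yr")
```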
To Eat or Not to Eat: An Easy Simulation of Optimal Diet Selection in the Classroom
ERIC Educational Resources Information Center
Ray, Darrell L.
2010-01-01
Optimal diet selection, a component of optimal foraging theory, suggests that animals should select a diet that either maximizes energy or nutrient consumption per unit time or minimizes the foraging time needed to attain required energy or nutrients. In this exercise, students simulate the behavior of foragers that either show no foraging…
Robotics in surgery: is a robot necessary? For what?
Ross, Sharona B; Downs, Darrell; Saeed, Sabrina M; Dolce, John K; Rosemurgy, Alexander S
2017-02-01
Every operation can be categorized along a spectrum from "most invasive" to "least invasive", based on the approach(es) through which it is commonly undertaken. Operations that are considered "most invasive" are characterized by "open" approaches with a relatively high degree of morbidity, while operations that are considered "least invasive" are undertaken with minimally invasive techniques and are associated with relatively improved patient outcomes, including faster recovery times and fewer complications. Because of the potential for reduced morbidity, movement along the spectrum towards minimally invasive surgery (MIS) is associated with a host of salutary benefits and, as well, lower costs of patient care. Accordingly, the goal of all stakeholders in surgery should be to attain universal application of the most minimally invasive approaches. Yet the difficulty of performing minimally invasive operations has largely limited its widespread application in surgery, particularly in the context of complex operations (i.e., those requiring complex extirpation and/or reconstruction). Robotic surgery, however, may facilitate application of minimally invasive techniques requisite for particular operations. Enhancements in visualization and dexterity offered by robotic surgical systems allow busy surgeons to quickly gain proficiency in demanding techniques (e.g., pancreaticojejunostomy), within a short learning curve. That is not to say, however, that all operations undertaken with minimally invasive techniques require robotic technology. Herein, we attempt to define how surgeon skill, operative difficulty, patient outcomes, and cost factors determine when robotic technology should be reasonably applied to patient care in surgery.
Starshade orbital maneuver study for WFIRST
NASA Astrophysics Data System (ADS)
Soto, Gabriel; Sinha, Amlan; Savransky, Dmitry; Delacroix, Christian; Garrett, Daniel
2017-09-01
The Wide Field Infrared Survey Telescope (WFIRST) mission, scheduled for launch in the mid-2020s will perform exoplanet science via both direct imaging and a microlensing survey. An internal coronagraph is planned to perform starlight suppression for exoplanet imaging, but an external starshade could be used to achieve the required high contrasts with potentially higher throughput. This approach would require a separately-launched occulter spacecraft to be positioned at exact distances from the telescope along the line of sight to a target star system. We present a detailed study to quantify the Δv requirements and feasibility of deploying this additional spacecraft as a means of exoplanet imaging. The primary focus of this study is the fuel use of the occulter while repositioning between targets. Based on its design, the occulter is given an offset distance from the nominal WFIRST halo orbit. Target star systems and look vectors are generated using Exoplanet Open-Source Imaging Simulator (EXOSIMS); a boundary value problem is then solved between successive targets. On average, 50 observations are achievable with randomly selected targets given a 30-day transfer time. Individual trajectories can be optimized for transfer time as well as fuel usage to be used in mission scheduling. Minimizing transfer time reduces the total mission time by up to 4.5 times in some simulations before expending the entire fuel budget. Minimizing Δv can generate starshade missions that achieve over 100 unique observations within the designated mission lifetime of WFIRST.
Novel physical constraints on implementation of computational processes
NASA Astrophysics Data System (ADS)
Wolpert, David; Kolchinsky, Artemy
Non-equilibrium statistical physics permits us to analyze computational processes, i.e., ways to drive a physical system such that its coarse-grained dynamics implements some desired map. It is now known how to implement any such desired computation without dissipating work, and what the minimal (dissipationless) work is that such a computation will require (the so-called "generalized Landauer bound"). We consider how these analyses change if we impose realistic constraints on the computational process. First, we analyze how many degrees of freedom of the system must be controlled, in addition to the ones specifying the information-bearing degrees of freedom, in order to avoid dissipating work during a given computation, when local detailed balance holds. We analyze this issue for deterministic computations, deriving a state-space vs. speed trade-off, and use our results to motivate a measure of the complexity of a computation. Second, we consider computations that are implemented with logic circuits, in which only a small number of degrees of freedom are coupled at a time. We show that the way a computation is implemented using circuits affects its minimal work requirements, and relate these minimal work requirements to information-theoretic measures of complexity.
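For context, the bound referred to here is usually stated as follows (standard textbook form, in my notation rather than the abstract's): the minimal expected work to implement a map taking an input distribution over the information-bearing degrees of freedom to an output distribution is set by the drop in Shannon entropy.

```latex
% Standard (generalized) Landauer bound, stated for context only:
\langle W \rangle \;\ge\; k_B T \ln 2 \,\bigl[\, H(p_{\mathrm{in}}) - H(p_{\mathrm{out}}) \,\bigr],
\qquad
H(p) \;=\; -\sum_x p(x)\,\log_2 p(x).
```

Erasing one bit (entropy dropping from 1 to 0) then costs at least k_B T ln 2 of work, the familiar Landauer limit.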
Process Improvements in Training Device Acceptance Testing: A Study in Total Quality Management
1990-12-12
Quality Management, a small group of Government and industry specialists examined the existing training device acceptance test process for potential improvements. The agreed-to mission of the Air Force/Industry partnership was to continuously identify and promote implementable approaches to minimize the cost and time required for acceptance testing while ensuring that validated performance supports the user training requirements. Application of a Total Quality process improvement model focused on the customers and their requirements, analyzed how work was accomplished, and
13 CFR 143.20 - Standards for financial management systems.
Code of Federal Regulations, 2011 CFR
2011-01-01
... Requirements Financial Administration § 143.20 Standards for financial management systems. (a) A State must... prohibitions of applicable statutes. (b) The financial management systems of other grantees and subgrantees... subgrant award documents, etc. (7) Cash management. Procedures for minimizing the time elapsing between the...
15 CFR 24.20 - Standards for financial management systems.
Code of Federal Regulations, 2011 CFR
2011-01-01
... Requirements Financial Administration § 24.20 Standards for financial management systems. (a) A State must... prohibitions of applicable statutes. (b) The financial management systems of other grantees and subgrantees... subgrant award documents, etc. (7) Cash management. Procedures for minimizing the time elapsing between the...
13 CFR 143.20 - Standards for financial management systems.
Code of Federal Regulations, 2012 CFR
2012-01-01
... Requirements Financial Administration § 143.20 Standards for financial management systems. (a) A State must... prohibitions of applicable statutes. (b) The financial management systems of other grantees and subgrantees... subgrant award documents, etc. (7) Cash management. Procedures for minimizing the time elapsing between the...
13 CFR 143.20 - Standards for financial management systems.
Code of Federal Regulations, 2014 CFR
2014-01-01
... Requirements Financial Administration § 143.20 Standards for financial management systems. (a) A State must... prohibitions of applicable statutes. (b) The financial management systems of other grantees and subgrantees... subgrant award documents, etc. (7) Cash management. Procedures for minimizing the time elapsing between the...
15 CFR 24.20 - Standards for financial management systems.
Code of Federal Regulations, 2013 CFR
2013-01-01
... Requirements Financial Administration § 24.20 Standards for financial management systems. (a) A State must... prohibitions of applicable statutes. (b) The financial management systems of other grantees and subgrantees... subgrant award documents, etc. (7) Cash management. Procedures for minimizing the time elapsing between the...
13 CFR 143.20 - Standards for financial management systems.
Code of Federal Regulations, 2013 CFR
2013-01-01
... Requirements Financial Administration § 143.20 Standards for financial management systems. (a) A State must... prohibitions of applicable statutes. (b) The financial management systems of other grantees and subgrantees... subgrant award documents, etc. (7) Cash management. Procedures for minimizing the time elapsing between the...
28 CFR 66.20 - Standards for financial management systems.
Code of Federal Regulations, 2012 CFR
2012-07-01
... Requirements Financial Administration § 66.20 Standards for financial management systems. (a) A State must... prohibitions of applicable statutes. (b) The financial management systems of other grantees and subgrantees... subgrant award documents, etc. (7) Cash management. Procedures for minimizing the time elapsing between the...
Smith, Peter; Chen, Cynthia; Mustard, Cameron; Hogg-Johnson, Sheilah; Tompa, Emile
2015-04-01
The objective of this study was to examine individual, occupational, and workplace level factors associated with time loss following a similar injury. Seven thousand three hundred and forty-eight workers' compensation claims that did not require time off work were matched with up to four claims that required time off work on the event, nature, and part of body injured as well as injury year. Conditional logistic regression models examined individual, occupational, and workplace level factors that were associated with the likelihood of not requiring time off work. Employees from firms with higher premium rates were more likely to report no time loss from work and workers in more physically demanding occupations were less likely to report no time loss from work. We observed no association between age or gender and the probability of a time loss claim submission. Our results suggest that insurance costs are an incentive for workplaces to adopt policies and practices that minimize time loss following a work injury. © 2015 Wiley Periodicals, Inc.
An integral nuclear power and propulsion system concept
NASA Astrophysics Data System (ADS)
Choong, Phillip T.; Teofilo, Vincent L.; Begg, Lester L.; Dunn, Charles; Otting, William
An integral space power concept provides both the electrical power and propulsion from a common heat source and offers superior performance capabilities over conventional orbital insertion using chemical propulsion systems. This paper describes a hybrid (bimodal) system concept based on a proven, inherently safe solid fuel form for the high temperature reactor core operation and rugged planar thermionic energy converter for long-life steady state electric power production combined with NERVA-based rocket technology for propulsion. The integral system is capable of long-life power operation and multiple propulsion operations. At an optimal thrust level, the integral system can maintain the minimal delta-V requirement while minimizing the orbital transfer time. A trade study comparing the overall benefits in placing large payloads to GEO with the nuclear electric propulsion option shows superiority of nuclear thermal propulsion. The resulting savings in orbital transfer time and the substantial reduction of overall lift requirement enables the use of low-cost launchers for several near-term military satellite missions.
Keeley, F X; Tolley, D A
1998-04-01
Endoscopic treatment of upper-tract transitional-cell carcinoma (TCC) is well established. Nevertheless, many patients still required major ablative surgery. We have applied our experience with laparoscopic nephrectomy to the performance of laparoscopic nephroureterectomy in order to make the management of upper-tract TCC entirely minimally invasive. Since 1993, we have performed 22 laparoscopic nephroureterectomies for upper-tract TCC. Initially, we excluded patients with tumors below the pelvic brim, but we now offer a trial of laparoscopy to all patients. We describe the evolution of our technique, which involves resecting the ureteral orifice prior to laparoscopic dissection of the kidney and ureter. We have had to convert three cases to open surgery, one each for bleeding, failure to progress, and unappreciated tumor extent. Operative times averaged 156 minutes, which compares well with contemporary times for open nephroureterectomy. Complication rates, transfusion requirements, and length of stay, although higher than those of laparoscopic nephrectomy, were all reduced in comparison with open nephroureterectomy.
Boeing 747 aircraft with large external pod for transporting outsize cargo
NASA Technical Reports Server (NTRS)
Price, J. E.; Quartero, C. B.; Smith, P. M.; Washburn, G. F.
1979-01-01
The effect on structural arrangement, system weight, and range performance of the cargo pod payload carrying capability was determined to include either the bridge launcher or a spacelab module on a Boeing 747 aircraft. Modifications to the carrier aircraft and the installation time required to attach the external pod to the 747 were minimized. Results indicate that the increase in pod size was minimal, and that the basic 747 structure was adequate to safely absorb the load induced by ground or air operation while transporting either payload.
A multi-objective decision-making approach to the journal submission problem.
Wong, Tony E; Srikrishnan, Vivek; Hadka, David; Keller, Klaus
2017-01-01
When researchers complete a manuscript, they need to choose a journal to which they will submit the study. This decision requires navigating trade-offs between multiple objectives. One objective is to share the new knowledge as widely as possible. Citation counts can serve as a proxy to quantify this objective. A second objective is to minimize the time commitment put into sharing the research, which may be estimated by the total time from initial submission to final decision. A third objective is to minimize the number of rejections and resubmissions. Thus, researchers often consider the trade-offs between the objectives of (i) maximizing citations, (ii) minimizing time-to-decision, and (iii) minimizing the number of resubmissions. To complicate matters further, this is a decision with multiple, potentially conflicting, decision-maker rationalities. Co-authors might have different preferences, for example about publishing fast versus maximizing citations. These diverging preferences can lead to conflicting trade-offs between objectives. Here, we apply a multi-objective decision analytical framework to identify the Pareto front between these objectives and determine the set of journal submission pathways that balance these objectives for three stages of a researcher's career. We find multiple strategies that researchers might pursue, depending on how they value minimizing risk and effort relative to maximizing citations. The sequences that maximize expected citations within each strategy are generally similar, regardless of time horizon. We find that the "conditional impact factor" (impact factor times acceptance rate) is a suitable heuristic method for ranking journals, striking a balance between minimizing effort objectives and maximizing citation count. Finally, we examine potential co-author tension resulting from differing rationalities by mapping out each researcher's preferred Pareto front and identifying compromise submission strategies. The explicit representation of trade-offs, especially when multiple decision-makers (co-authors) have different preferences, facilitates negotiations and can support the decision process.
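The ranking heuristic the abstract highlights is simple enough to show directly; the journal names and numbers below are invented for the example.

```python
# Toy illustration of the "conditional impact factor" heuristic described in
# the abstract (impact factor times acceptance rate); values are made up.
journals = [
    {"name": "Journal A", "impact_factor": 9.0, "acceptance_rate": 0.08},
    {"name": "Journal B", "impact_factor": 4.5, "acceptance_rate": 0.30},
    {"name": "Journal C", "impact_factor": 2.8, "acceptance_rate": 0.55},
]

for j in journals:
    j["conditional_if"] = j["impact_factor"] * j["acceptance_rate"]

# Submit first to the journal with the highest conditional impact factor.
for j in sorted(journals, key=lambda j: j["conditional_if"], reverse=True):
    print(f'{j["name"]}: conditional IF = {j["conditional_if"]:.2f}')
```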
A method of hidden Markov model optimization for use with geophysical data sets
NASA Technical Reports Server (NTRS)
Granat, R. A.
2003-01-01
Geophysics research has been faced with a growing need for automated techniques with which to process large quantities of data. A successful tool must meet a number of requirements: it should be consistent, require minimal parameter tuning, and produce scientifically meaningful results in reasonable time. We introduce a hidden Markov model (HMM)-based method for analysis of geophysical data sets that attempts to address these issues.
Management of pilonidal disease.
Kallis, Michelle P; Maloney, Caroline; Lipskar, Aaron M
2018-06-01
Pilonidal disease, and the treatment associated with it, can cause significant morbidity and substantial burden to patients' quality of life. Despite the plethora of surgical techniques that have been developed to treat pilonidal disease, discrepancies in technique, recurrence rates, complications, time to return to work/school and patients' aesthetic satisfaction between treatment options have led to controversy over the best approach to this common acquired disease of young adults. The management of pilonidal disease must strike a balance between recurrence and surgical morbidity. The commonly performed wide excision without closure has prolonged recovery, while flap closures speed recovery time and improve aesthetics at the expense of increased wound complications. Less invasive surgical techniques have recently evolved and are straightforward, with minimal morbidity and satisfactory results. As with any surgical intervention, the ideal treatment for pilonidal disease would be simple and cost-effective, cause minimal pain, have a limited hospital stay, low recurrence rate and require minimal time off from school or work. Less invasive procedures for pilonidal disease may be favourable as an initial approach for these patients reserving complex surgical treatment for refractory disease.
Current methods of diagnosis and management of ureteral injuries.
Armenakas, N A
1999-04-01
A delay in diagnosis is the most important contributory factor in morbidity related to ureteral injury. The difficulty in making the diagnosis can be minimized by maintenance of a high index of suspicion and the timely performance of the appropriate radiographic and intraoperative evaluations. A decision on the timing of repair of the ureteral injury is based on the patient's overall condition, promptness of injury recognition, and proper injury staging. Ideally, when identified promptly, ureteral injuries should be repaired immediately. However, once there has been a delay in diagnosis or in the case of an unstable patient, temporizing measures can be used for urinary diversion. With the availability of simple, minimally invasive techniques to manage urinary extravasation and the absence of any risk of ureteral hemorrhage, ureteral reconstruction can be safely deferred until an opportune time during the recovery period. Successful surgical management requires familiarity with the broad reconstructive armamentarium and meticulous attention to the specific details of each procedure. Through adherence to the diagnostic and therapeutic principles outlined, complications can be minimized and renal preservation can be maximized in patients sustaining ureteral injuries.
Identifying Minimal Changes in Nonerosive Reflux Disease: Is the Pay Worth the Labor?
Gabbard, Scott L; Fass, Ronnie; Maradey-Romero, Carla; Gingold Belfer, Rachel; Dickman, Ram
2016-01-01
Gastroesophageal reflux disease has a variable presentation on upper endoscopy. Gastroesophageal reflux disease can be divided into 3 endoscopic categories: Barrett's esophagus, erosive esophagitis, and normal mucosa/nonerosive reflux disease (NERD). Each of these phenotypes behave in a distinct manner, in regards to symptom response to treatment, and risk of development of complications such as esophageal adenocarcinoma. Recently, it has been proposed to further differentiate NERD into 2 categories: those with and those without "minimal changes." These minimal changes include endoscopic abnormalities, such as villous mucosal surface, mucosal islands, microerosions, and increased vascularity at the squamocolumnar junction. Although some studies have shown that patients with minimal changes may have higher rates of esophageal acid exposure compared with those without minimal changes, it is currently unclear if these patients behave differently than those currently categorized as having NERD. The clinical utility of identifying these lesions should be weighed against the cost of the requisite equipment and the additional time required for diagnosis, compared with conventional white light endoscopy.
Reliability enhancement of Navier-Stokes codes through convergence enhancement
NASA Technical Reports Server (NTRS)
Choi, K.-Y.; Dulikravich, G. S.
1993-01-01
Reduction of the total computing time required by an iterative algorithm for solving the Navier-Stokes equations is an important aspect of making existing and future analysis codes more cost effective. Several attempts have been made to accelerate the convergence of an explicit Runge-Kutta time-stepping algorithm. These acceleration methods are based on local time stepping, implicit residual smoothing, enthalpy damping, and multigrid techniques. Also, an extrapolation procedure based on the power method and the Minimal Residual Method (MRM) were applied to Jameson's multigrid algorithm. The MRM uses the same values of optimal weights for the corrections to every equation in a system and has not been shown to accelerate the scheme without multigriding. Our Distributed Minimal Residual (DMR) method, based on our General Nonlinear Minimal Residual (GNLMR) method, allows each component of the solution vector in a system of equations to have its own convergence speed. The DMR method was found capable of reducing the computation time by 10-75 percent depending on the test case and grid used. Recently, we have developed and tested a new method, termed Sensitivity Based DMR or SBMR method, that is easier to implement in different codes and is even more robust and computationally efficient than our DMR method.
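To make the minimal-residual idea concrete, here is a generic sketch in which corrections from a basic iteration (plain Jacobi on a linear system) are recombined with weights chosen by least squares to minimize the residual norm. It illustrates MRM-style weighting only; it is not the authors' DMR/SBMR scheme or their flow solver.

```python
import numpy as np

def jacobi_correction(A, b, x):
    """One Jacobi correction d such that the plain update would be x + d."""
    return (b - A @ x) / np.diag(A)

def mrm_accelerated_jacobi(A, b, x0, n_corrections=4, iters=50):
    """Recombine blocks of Jacobi corrections with residual-minimizing weights."""
    x = x0.copy()
    for _ in range(iters):
        # build a block of successive Jacobi corrections
        corrections = []
        y = x.copy()
        for _ in range(n_corrections):
            d = jacobi_correction(A, b, y)
            corrections.append(d)
            y = y + d
        D = np.column_stack(corrections)
        # choose weights w minimizing || b - A (x + D w) ||_2
        r = b - A @ x
        w, *_ = np.linalg.lstsq(A @ D, r, rcond=None)
        x = x + D @ w
    return x

n = 50
rng = np.random.default_rng(0)
A = np.eye(n) * 4.0 + rng.uniform(-1, 1, (n, n)) / n   # diagonally dominant test matrix
b = rng.standard_normal(n)
x = mrm_accelerated_jacobi(A, b, np.zeros(n))
print("residual norm:", np.linalg.norm(b - A @ x))
```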
Separation-Compliant, Optimal Routing and Control of Scheduled Arrivals in a Terminal Airspace
NASA Technical Reports Server (NTRS)
Sadovsky, Alexander V.; Davis, Damek; Isaacson, Douglas R.
2013-01-01
We address the problem of navigating a set (fleet) of aircraft in an aerial route network so as to bring each aircraft to its destination at a specified time and with minimal distance separation assured between all aircraft at all times. The speed range, initial position, required destination, and required time of arrival at destination for each aircraft are assumed to be given. Each aircraft's movement is governed by a controlled differential equation (state equation). The problem consists in choosing for each aircraft a path in the route network and a control strategy so as to meet the constraints and reach the destination at the required time. The main contribution of the paper is a model that allows this problem to be recast as a decoupled collection of problems in classical optimal control and is easily generalized to the case when inertia cannot be neglected. Some qualitative insight into solution behavior is obtained using the Pontryagin Maximum Principle. Sample numerical solutions are computed using a numerical optimal control solver. The proposed model is a first step toward increasing the fidelity of continuous-time control models of air traffic in a terminal airspace. The Pontryagin Maximum Principle implies the polygonal shape of those portions of the state trajectories away from states in which one or more aircraft pairs are at minimal separation. The model also confirms the intuition that the narrower the allowed speed ranges of the aircraft, the smaller the space of optimal solutions, and that an instance of the optimal control problem may not have a solution at all (i.e., no control strategy that meets the separation requirement and other constraints).
Application of quadratic optimization to supersonic inlet control
NASA Technical Reports Server (NTRS)
Lehtinen, B.; Zeller, J. R.
1971-01-01
The application of linear stochastic optimal control theory to the design of the control system for the air intake (inlet) of a supersonic air-breathing propulsion system is discussed. The controls must maintain a stable inlet shock position in the presence of random airflow disturbances and prevent inlet unstart. Two different linear time invariant control systems are developed. One is designed to minimize a nonquadratic index, the expected frequency of inlet unstart, and the other is designed to minimize the mean square value of inlet shock motion. The quadratic equivalence principle is used to obtain the best linear controller that minimizes the nonquadratic performance index. The two systems are compared on the basis of unstart prevention, control effort requirements, and sensitivity to parameter variations.
Scientist/AMPS equipment interface study
NASA Technical Reports Server (NTRS)
Anderson, H. R.
1977-01-01
The principal objective was to determine for each experiment how the operating procedures and modes of equipment onboard shuttle can be managed in real-time or near-real-time to enhance the quality of results. As part of this determination the data and display devices that a man will need for real-time management are defined. The secondary objectives, as listed in the RFQ and technical proposal, were to: (1) determine what quantities are to be measured; (2) determine permissible background levels; (3) decide in what portions of space measurements are to be made; (4) estimate bit rates; (5) establish time-lines for operating the experiments on a mission or set of missions; and (6) determine the minimum set of hardware needed for real-time display. Experiment descriptions and requirements were written. The requirements of the various experiments are combined and a minimal set of joint requirements are defined.
Dynamic implicit 3D adaptive mesh refinement for non-equilibrium radiation diffusion
NASA Astrophysics Data System (ADS)
Philip, B.; Wang, Z.; Berrill, M. A.; Birke, M.; Pernice, M.
2014-04-01
The time dependent non-equilibrium radiation diffusion equations are important for solving the transport of energy through radiation in optically thick regimes and find applications in several fields including astrophysics and inertial confinement fusion. The associated initial boundary value problems that are encountered often exhibit a wide range of scales in space and time and are extremely challenging to solve. To efficiently and accurately simulate these systems we describe our research on combining techniques that will also find use more broadly for long term time integration of nonlinear multi-physics systems: implicit time integration for efficient long term time integration of stiff multi-physics systems, local control theory based step size control to minimize the required global number of time steps while controlling accuracy, dynamic 3D adaptive mesh refinement (AMR) to minimize memory and computational costs, Jacobian Free Newton-Krylov methods on AMR grids for efficient nonlinear solution, and optimal multilevel preconditioner components that provide level independent solver convergence.
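A standard local-error-based step-size controller of the kind referenced above (step doubling plus a proportional controller) can be sketched on a scalar stiff test problem with backward Euler; the test equation, tolerance, and controller constants are illustrative and not from the paper.

```python
import math

LAM = 50.0  # stiffness parameter of the test ODE  y' = -LAM * (y - cos t)

def be_step(y, t, h):
    """One backward-Euler step for the linear test ODE (solved in closed form)."""
    return (y + h * LAM * math.cos(t + h)) / (1.0 + h * LAM)

def integrate(y0, t0, t_end, h0=0.1, tol=1e-4):
    """Adapt the step size so the step-doubling local error estimate stays below tol."""
    t, y, h, steps = t0, y0, h0, 0
    while t < t_end:
        h = min(h, t_end - t)
        y_full = be_step(y, t, h)                               # one step of size h
        y_half = be_step(be_step(y, t, h / 2), t + h / 2, h / 2)  # two steps of h/2
        err = abs(y_half - y_full)                              # local error estimate
        if err <= tol:                                          # accept the step
            t, y = t + h, y_half
            steps += 1
        # proportional controller for a first-order method (safety factor 0.9)
        h *= min(4.0, max(0.1, 0.9 * (tol / max(err, 1e-14)) ** 0.5))
    return y, steps

y, steps = integrate(y0=2.0, t0=0.0, t_end=5.0)
print(f"y(5) ~ {y:.5f} in {steps} accepted steps")
```

The controller grows the step in smooth regions and shrinks it where the local error estimate exceeds the tolerance, which is the mechanism used to keep the global number of time steps small while controlling accuracy.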
Automated Dental Epidemiology System. II. Systems Analysis and Functional Design,
1983-08-01
reduction of time and expense required for dental treatment and a minimization of patient time lost from military duties. Navy dentistry can thus be...regard, dental epidemiology can be especially valuable for evaluating and improving the Navy preventive dentistry program. It has been recommended that...processing applications to dentistry and dental epidemiology was performed. Alternative means to improve military dental epidemiology techniques and
78 FR 40823 - Reports, Forms, and Record Keeping Requirements
Federal Register 2010, 2011, 2012, 2013, 2014
2013-07-08
... at time of approval. Title: National Survey of Principal Drivers of Vehicles with a Rear Seat Belt... from both groups and information on their passengers seat belt usage habits, as well as the... use computer-assisted telephone interviewing to reduce interview length and minimize recording errors...
Keeping Kids Engaged Fights Plagiarism, Too
ERIC Educational Resources Information Center
Johnson, Doug
2004-01-01
Educators try to "catch" plagiarism, but their time is better spent creating assignments that require original, thoughtful research and, therefore, minimize plagiarism. In this article, the author presents two scenarios which exemplify projects that encourage students to do original work. Moreover, some qualities of projects carrying a low…
28 CFR 66.20 - Standards for financial management systems.
Code of Federal Regulations, 2013 CFR
2013-07-01
... subgrant award documents, etc. (7) Cash management. Procedures for minimizing the time elapsing between the... 28 Judicial Administration 2 2013-07-01 2013-07-01 false Standards for financial management... Requirements Financial Administration § 66.20 Standards for financial management systems. (a) A State must...
28 CFR 66.20 - Standards for financial management systems.
Code of Federal Regulations, 2014 CFR
2014-07-01
... subgrant award documents, etc. (7) Cash management. Procedures for minimizing the time elapsing between the... 28 Judicial Administration 2 2014-07-01 2014-07-01 false Standards for financial management... Requirements Financial Administration § 66.20 Standards for financial management systems. (a) A State must...
45 CFR 2541.200 - Standards for financial management systems.
Code of Federal Regulations, 2013 CFR
2013-10-01
..., contract and subgrant award documents, etc. (7) Cash management. Procedures for minimizing the time... 45 Public Welfare 4 2013-10-01 2013-10-01 false Standards for financial management systems. 2541... STATE AND LOCAL GOVERNMENTS Post-Award Requirements § 2541.200 Standards for financial management...
28 CFR 66.20 - Standards for financial management systems.
Code of Federal Regulations, 2011 CFR
2011-07-01
... subgrant award documents, etc. (7) Cash management. Procedures for minimizing the time elapsing between the... 28 Judicial Administration 2 2011-07-01 2011-07-01 false Standards for financial management... Requirements Financial Administration § 66.20 Standards for financial management systems. (a) A State must...
Orbital Battleship: A Guessing Game to Reinforce Atomic Structure
ERIC Educational Resources Information Center
Kurushkin, Mikhail; Mikhaylenko, Maria
2016-01-01
A competitive educational guessing game "Orbital Battleship" which reinforces Madelung's and Hund's rules, values of quantum numbers, and understanding of periodicity was designed. The game develops strategic thinking, is not time-consuming, requires minimal preparation and supervision, and is an efficient and fun alternative to more…
31 CFR 205.11 - What requirements apply to funding techniques?
Code of Federal Regulations, 2014 CFR
2014-07-01
... Program Agency must minimize the time elapsing between the transfer of funds from the United States Treasury and the State's payout of funds for Federal assistance program purposes, whether the transfer... EFFICIENT FEDERAL-STATE FUNDS TRANSFERS Rules Applicable to Federal Assistance Programs Included in a...
NASA Technical Reports Server (NTRS)
Hughes, John; Marius, Julio L.; Montoro, Manuel; Patel, Mehul; Bludworth, David
2006-01-01
This paper is a case study of the development and execution of the End-of-Mission plans for the Earth Radiation Budget Satellite (ERBS) and the Upper Atmosphere Research Satellite (UARS). The goals of the End-of-Mission plans are to minimize the time the spacecraft remains on orbit and to minimize the risk of creating orbital debris. Both of these missions predate the NASA Management Instructions (NMI) that direct missions to provide for safe mission termination. Each spacecraft had its own unique challenges, which required assessing End-of-Mission requirements against spacecraft limitations. Ultimately, the End-of-Mission operations were about risk mitigation. This paper describes the operational challenges and the lessons learned executing these End-of-Mission plans.
Influence of heat transfer rates on pressurization of liquid/slush hydrogen propellant tanks
NASA Technical Reports Server (NTRS)
Sasmal, G. P.; Hochstein, J. I.; Hardy, T. L.
1993-01-01
A multi-dimensional computational model of the pressurization process in a liquid/slush hydrogen tank is developed and used to study the influence of heat flux rates at the ullage boundaries on the process. The new model computes these rates and performs an energy balance for the tank wall, whereas previous multi-dimensional models required a priori specification of the boundary heat flux rates. Analyses of both liquid hydrogen and slush hydrogen pressurization were performed to expose differences between the two processes. Graphical displays are presented to establish the dependence of pressurization time, pressurant mass required, and other parameters of interest on ullage boundary heat flux rates and pressurant mass flow rate. Detailed velocity fields and temperature distributions are presented for selected cases to further illuminate the details of the pressurization process. It is demonstrated that ullage boundary heat flux rates significantly affect the pressurization process and that minimizing heat loss from the ullage and maximizing pressurant flow rate minimizes the mass of pressurant gas required to pressurize the tank. It is further demonstrated that proper dimensionless scaling of pressure and time permits all the pressure histories examined during this study to be displayed as a single curve.
Radiology teaching file cases on the World Wide Web.
Scalzetti, E M
1997-08-01
The presentation of a radiographic teaching file on the World Wide Web can be enhanced by attending to principles of web design. Chief among these are appropriate control of page layout, minimization of the time required to download a page from the remote server, and provision for navigation within and among the web pages that constitute the site. Page layout is easily accomplished by the use of tables; column widths can be fixed to maintain an acceptable line length for text. Downloading time is minimized by rigorous editing and by optimal compression of image files; beyond this, techniques like preloading of images and specification of image width and height are also helpful. Navigation controls should be clear, consistent, and readily available.
A 3D virtual reality simulator for training of minimally invasive surgery.
Mi, Shao-Hua; Hou, Zeng-Gunag; Yang, Fan; Xie, Xiao-Liang; Bian, Gui-Bin
2014-01-01
For the last decade, remarkable progress has been made in the field of cardiovascular disease treatment. However, these complex medical procedures require a combination of rich experience and technical skills. In this paper, a 3D virtual reality simulator for core skills training in minimally invasive surgery is presented. The system can generate realistic 3D vascular models segmented from patient datasets, including a beating heart, and provides real-time force computation and a force feedback module for surgical simulation. Instruments, such as a catheter or guide wire, are represented by a multi-body mass-spring model. In addition, a realistic user interface with multiple windows and real-time 3D views is developed. Moreover, the simulator is also provided with a human-machine interaction module that gives doctors the sense of touch during surgical training and enables them to control the motion of a virtual catheter/guide wire inside a complex vascular model. Experimental results show that the simulator is suitable for minimally invasive surgery training.
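To make the multi-body mass-spring representation mentioned in the abstract concrete, here is a minimal Python sketch of a catheter/guide wire modeled as a damped chain of point masses joined by springs and advanced with semi-implicit Euler. All parameter values, names, and the integration scheme are illustrative assumptions, not details from the paper.

```python
import numpy as np

# Illustrative multi-body mass-spring model of a catheter/guide wire:
# a chain of point masses joined by stiff springs with viscous damping,
# advanced with semi-implicit Euler. Parameters are made up for this sketch.
N = 20              # number of nodes along the wire
rest_len = 0.005    # rest length of each segment (m)
mass = 1e-4         # mass per node (kg)
k = 50.0            # spring stiffness (N/m)
c = 0.02            # damping coefficient
dt = 1e-4           # time step (s)

pos = np.column_stack([np.arange(N) * rest_len, np.zeros(N)])  # straight wire in 2D
vel = np.zeros_like(pos)

def step(pos, vel, tip_force):
    """Advance the chain one time step; an external force acts on the tip node."""
    force = np.zeros_like(pos)
    seg = pos[1:] - pos[:-1]                      # segment vectors
    length = np.linalg.norm(seg, axis=1, keepdims=True)
    direction = seg / np.maximum(length, 1e-12)
    spring = k * (length - rest_len) * direction  # Hooke's law along each segment
    force[:-1] += spring                          # pulls node i toward node i+1
    force[1:] -= spring                           # and node i+1 toward node i
    force -= c * vel                              # simple viscous damping
    force[-1] += tip_force                        # e.g. haptic/contact force at the tip
    vel = vel + dt * force / mass
    vel[0] = 0.0                                  # proximal (inserted) end is held fixed
    return pos + dt * vel, vel

for _ in range(1000):                             # push the tip sideways for 0.1 s
    pos, vel = step(pos, vel, tip_force=np.array([0.0, 1e-3]))
print(pos[-1])                                    # final tip position
```

A real simulator would add bending stiffness and collision response against the vessel wall; this sketch only shows the axial spring chain the abstract alludes to.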
A minimally invasive method for extraction of sturgeon oocytes
Candrl, James S.; Papoulias, Diana M.; Tillitt, Donald E.
2010-01-01
Fishery biologists, hatchery personnel, and caviar fishers routinely extract oocytes from sturgeon (Acipenseridae) to determine the stage of maturation by checking egg quality. Typically, oocytes are removed either by inserting a catheter into the oviduct or by making an incision in the body cavity. Both methods can be time-consuming and stressful to the fish. We describe a device to collect mature oocytes from sturgeons quickly and effectively with minimal stress on the fish. The device is made by creating a needle from stainless steel tubing and connecting it to a syringe with polyvinyl chloride tubing. The device is filled with saline solution or water, the needle is inserted into the abdominal wall, and eggs are extracted from the fish. Using this device, an oocyte sample can be collected in less than 30 s. Such sampling leaves a minute wound that heals quickly and does not require suturing. The extractor device can easily be used in the field or hatchery, reduces fish handling time, and minimizes stress.
Does Minimally Invasive Spine Surgery Minimize Surgical Site Infections?
Kulkarni, Arvind Gopalrao; Patel, Ravish Shammi; Dutta, Shumayou
2016-12-01
Retrospective review of prospectively collected data. To evaluate the incidence of surgical site infections (SSIs) in minimally invasive spine surgery (MISS) in a cohort of patients, compare it with available historical data on SSI in open spinal surgery cohorts, and evaluate the additional direct costs incurred due to SSI. SSI can lead to prolonged antibiotic therapy, extended hospitalization, repeated operations, and implant removal. The small incisions and minimal dissection intrinsic to MISS may minimize the risk of postoperative infections. However, there is a dearth of literature on infections after MISS and their additional direct financial implications. All patients from January 2007 to January 2015 undergoing posterior spinal surgery with a tubular retractor system and microscope in our institution were included. The procedures performed included tubular discectomies, tubular decompressions for spinal stenosis, and minimally invasive transforaminal lumbar interbody fusion (TLIF). The incidence of postoperative SSI was calculated and compared to the range of cited SSI rates from published studies. Direct costs were calculated from medical billing for index cases and for patients with SSI. A total of 1,043 patients underwent 763 noninstrumented surgeries (discectomies, decompressions) and 280 instrumented (TLIF) procedures. The mean age was 52.2 years, with a male:female ratio of 1.08:1. Three infections were encountered, all with fusion surgeries (mean detection time, 7 days). All three required wound wash and debridement, with one patient requiring unilateral implant removal. The additional direct cost due to infection was $2,678 per 100 MISS-TLIF. SSI increased hospital expenditure per patient 1.5-fold after instrumented MISS. The overall infection rate after MISS was 0.29%, with an SSI rate of 0% in non-instrumented MISS and 1.07% in instrumented MISS. MISS can markedly reduce the SSI rate and can be an effective tool to minimize hospital costs.
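As a quick arithmetic check, the reported infection rates follow directly from the counts quoted in the abstract; the per-patient cost figures depend on billing data not shown here.

```python
# Reproduce the infection rates quoted in the abstract from its raw counts.
total_cases = 1043
non_instrumented = 763   # discectomies, decompressions: 0 infections reported
instrumented = 280       # MIS-TLIF procedures: 3 infections reported
infections = 3

overall_rate = infections / total_cases * 100    # -> about 0.29%
tlif_rate = infections / instrumented * 100      # -> about 1.07%
print(f"overall {overall_rate:.2f}%, non-instrumented 0.00%, TLIF {tlif_rate:.2f}%")
```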
Purwar, Namrta; Tenboer, Jason; Tripathi, Shailesh; Schmidt, Marius
2013-09-13
Time-resolved spectroscopic experiments have been performed with protein in solution and in crystalline form using a newly designed microspectrophotometer. The time-resolution of these experiments can be as good as two nanoseconds (ns), which is the minimal response time of the image intensifier used. With the current setup, the effective time-resolution is about seven ns, determined mainly by the pulse duration of the nanosecond laser. The amount of protein required is small, on the order of 100 nanograms. Bleaching, which is an undesirable effect common to photoreceptor proteins, is minimized by using a millisecond shutter to avoid extensive exposure to the probing light. We investigate two model photoreceptors, photoactive yellow protein (PYP), and α-phycoerythrocyanin (α-PEC), on different time scales and at different temperatures. Relaxation times obtained from kinetic time-series of difference absorption spectra collected from PYP are consistent with previous results. The comparison with these results validates the capability of this spectrophotometer to deliver high quality time-resolved absorption spectra.
NASA Technical Reports Server (NTRS)
Mcenulty, R. E.
1977-01-01
The G189A simulation of the Shuttle Orbiter ECLSS was upgraded. All simulation library versions and simulation models were converted from the EXEC2 to the EXEC8 computer system, and a new program, G189PL, was added to the combination master program library. The program permits the post-plotting of up to 100 frames of plot data over any time interval of a G189 simulation run. The overlay structure of the G189A simulations was restructured in order to conserve computer core requirements and minimize run time requirements.
Minimally Invasive Tubular Resection of Lumbar Synovial Cysts: Report of 40 Consecutive Cases.
Birch, Barry D; Aoun, Rami James N; Elbert, Gregg A; Patel, Naresh P; Krishna, Chandan; Lyons, Mark K
2016-10-01
Lumbar synovial cysts are a relatively common clinical finding. Surgical treatment of symptomatic synovial cysts includes computed tomography-guided aspiration, open resection, and minimally invasive tubular resection. We report our series of 40 consecutive minimally invasive microscopic tubular lumbar synovial cyst resections. Following Institutional Review Board approval, a retrospective analysis of 40 cases of minimally invasive microscopic tubular retractor synovial cyst resection at a single institution by a single surgeon (B.D.B.) was conducted. Gross total resection was performed in all cases. Patient characteristics, surgical operating time, complications, and outcomes were analyzed. Lumbar radiculopathy was the presenting symptom in all but 1 patient, who presented with neurogenic claudication. The mean duration of symptoms was 6.5 months (range, 1-25 months), mean operating time was 58 minutes (range, 25-110 minutes), and mean blood loss was 20 mL (range, 5-50 mL). Seven patients required overnight observation. The median length of stay in the remaining 33 patients was 4 hours. There were 2 cerebrospinal fluid leaks, both repaired directly without sequelae. The mean follow-up duration was 80.7 months. Outcomes were good or excellent in 37 of the 40 patients, fair in 1 patient, and poor in 2 patients. Minimally invasive microscopic tubular retractor resection of lumbar synovial cysts can be done safely, with outcomes and complication rates comparable to those of open procedures and with potentially reduced operative time, length of stay, and healthcare costs. Patient selection for microscopic tubular synovial cyst resection is based in part on the anatomy of the spine and synovial cyst and is critical when recommending minimally invasive vs. open resection to patients. Copyright © 2016 Elsevier Inc. All rights reserved.
75 FR 36444 - Proposed Extension of the Approval of Information Collection Requirements
Federal Register 2010, 2011, 2012, 2013, 2014
2010-06-25
... be provided in the desired format, reporting burden (time and financial resources) is minimized... of the following methods: E-mail: [email protected] ; Mail, Hand Delivery, Courier: Regulatory... collection. Because we continue to experience delays in receiving mail in the Washington, DC area, commenters...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-12-20
... can be provided in the desired format, reporting burden (time and financial resources) is minimized, collection instruments are clearly understood, and the impact of collection requirements on respondents can... Party Settlement (CA- 1032). A copy of the proposed information collection request can be obtained by...
78 FR 35981 - Proposed Extension of the Approval of Information Collection Requirements
Federal Register 2010, 2011, 2012, 2013, 2014
2013-06-14
... format, reporting burden (time and financial resources) is minimized, collection instruments are clearly...: Medical Travel Refund Request (OWCP-957). A copy of the proposed information collection request can be... beneficiaries for travel expenses for covered medical treatment. In order to determine whether amounts requested...
75 FR 7292 - Proposed Extension of the Approval of Information Collection Requirements
Federal Register 2010, 2011, 2012, 2013, 2014
2010-02-18
... format, reporting burden (time and financial resources) is minimized, collection instruments are clearly...: Medical Travel Refund Request (OWCP-957). A copy of the proposed information collection request can be... beneficiaries for travel expenses for covered medical treatment. In order to determine whether amounts requested...
Federal Register 2010, 2011, 2012, 2013, 2014
2010-11-23
... requested data can be provided in the desired format, reporting burden (time and financial resources) is minimized, collection instruments are clearly understood, and the impact of collection requirements on... comments concerning the proposed collection: Pre-Hearing Statement (LS- 18). A copy of the proposed...
45 CFR 92.20 - Standards for financial management systems.
Code of Federal Regulations, 2013 CFR
2013-10-01
... subgrant award documents, etc. (7) Cash management. Procedures for minimizing the time elapsing between the... 45 Public Welfare 1 2013-10-01 2013-10-01 false Standards for financial management systems. 92.20...-Award Requirements Financial Administration § 92.20 Standards for financial management systems. (a) A...
45 CFR 92.20 - Standards for financial management systems.
Code of Federal Regulations, 2012 CFR
2012-10-01
... subgrant award documents, etc. (7) Cash management. Procedures for minimizing the time elapsing between the... 45 Public Welfare 1 2012-10-01 2012-10-01 false Standards for financial management systems. 92.20...-Award Requirements Financial Administration § 92.20 Standards for financial management systems. (a) A...
45 CFR 602.20 - Standards for financial management systems.
Code of Federal Regulations, 2014 CFR
2014-10-01
... subgrant award documents, etc. (7) Cash management. Procedures for minimizing the time elapsing between the... 45 Public Welfare 3 2014-10-01 2014-10-01 false Standards for financial management systems. 602.20... GOVERNMENTS Post-Award Requirements § 602.20 Standards for financial management systems. (a) A State must...
45 CFR 602.20 - Standards for financial management systems.
Code of Federal Regulations, 2012 CFR
2012-10-01
... subgrant award documents, etc. (7) Cash management. Procedures for minimizing the time elapsing between the... 45 Public Welfare 3 2012-10-01 2012-10-01 false Standards for financial management systems. 602.20... GOVERNMENTS Post-Award Requirements § 602.20 Standards for financial management systems. (a) A State must...
10 CFR 600.220 - Standards for financial management systems.
Code of Federal Regulations, 2011 CFR
2011-01-01
... subgrant award documents, etc. (7) Cash management. Procedures for minimizing the time elapsing between the... 10 Energy 4 2011-01-01 2011-01-01 false Standards for financial management systems. 600.220... Post-Award Requirements § 600.220 Standards for financial management systems. (a) A State must expend...
Federal Register 2010, 2011, 2012, 2013, 2014
2010-10-21
.... Estimated Cost (Operation and Maintenance): $0. IV. Public Participation--Submission of Comments on This... costs) is minimal, collection instruments are clearly understood, and OSHA's estimate of the information... of OSHA's estimate of the burden (time and costs) of the information collection requirements...
Federal Register 2010, 2011, 2012, 2013, 2014
2011-06-22
.... Estimated Total Burden Hours: 222,924. Estimated Cost (Operation and Maintenance): $0. IV. Public... costs) is minimal, collection instruments are clearly understood, and OSHA's estimate of the information... of OSHA's estimate of the burden (time and costs) of the information collection requirements...
Real-Time Aerodynamic Flow and Data Visualization in an Interactive Virtual Environment
NASA Technical Reports Server (NTRS)
Schwartz, Richard J.; Fleming, Gary A.
2005-01-01
Significant advances have been made in non-intrusive flow field diagnostics in the past decade. Camera-based techniques are now capable of determining physical quantities such as surface deformation, surface pressure and temperature, flow velocities, and molecular species concentration. In each case, extracting the pertinent information from the large volume of acquired data requires powerful and efficient data visualization tools. The additional requirement for real-time visualization is fueled by an increased emphasis on minimizing test time in expensive facilities. This paper addresses a capability titled LiveView3D, the first step in the development of an in-depth, real-time data visualization and analysis tool for use in aerospace testing facilities.
Miniature L-Band Radar Transceiver
NASA Technical Reports Server (NTRS)
McWatters, Dalia; Price, Douglas; Edelstein, Wendy
2007-01-01
A miniature L-band transceiver that operates at a carrier frequency of 1.25 GHz has been developed as part of a generic radar electronics module (REM) that would constitute one unit in an array of many identical units in a very-large-aperture phased-array antenna. NASA and the Department of Defense are considering the deployment of such antennas in outer space; the underlying principles of operation, and some of those of design, also are applicable on Earth. The large dimensions of the antennas make it advantageous to distribute radio-frequency electronic circuitry into elements of the arrays. The design of the REM is intended to implement the distribution. The design also reflects a requirement to minimize the size and weight of the circuitry in order to minimize the weight of any such antenna. Other requirements include making the transceiver robust and radiation-hard and minimizing power demand. Figure 1 depicts the functional blocks of the REM, including the L-band transceiver. The key functions of the REM include signal generation, frequency translation, amplification, detection, handling of data, and radar control and timing. An arbitrary-waveform generator that includes logic circuitry and a digital-to-analog converter (DAC) generates a linear-frequency-modulation chirp waveform. A frequency synthesizer produces local-oscillator signals used for frequency conversion and clock signals for the arbitrary-waveform generator, for a digitizer [that is, an analog-to-digital converter (ADC)], and for a control and timing unit. Digital functions include command, timing, telemetry, filtering, and high-rate framing and serialization of data for a high-speed scientific-data interface. The aforementioned digital implementation of filtering is a key feature of the REM architecture. Digital filters, in contradistinction to analog ones, provide consistent and temperature-independent performance, which is particularly important when REMs are distributed throughout a large array. Digital filtering also enables selection among multiple filter parameters as required for different radar operating modes. After digital filtering, data are decimated appropriately in order to minimize the data rate out of an antenna panel. The L-band transceiver (see Figure 2) includes a radio-frequency (RF)-to-baseband down-converter chain and an intermediate- frequency (IF)-to-RF up-converter chain. Transmit/receive (T/R) switches enable the use of a single feed to the antenna for both transmission and reception. The T/R switches also afford a built-in test capability by enabling injection of a calibration signal into the receiver chain. In order of decreasing priority, components of the transceiver were selected according to requirements of radiation hardness, then compactness, then low power. All of the RF components are radiation-hard. The noise figure (NF) was optimized to the extent that (1) a low-noise amplifier (LNA) (characterized by NF < 2 dB) was selected but (2) the receiver front-end T/R switches were selected for a high degree of isolation and acceptably low loss, regardless of the requirement to minimize noise.
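The filter-then-decimate stage described above can be illustrated generically. This is not the REM's actual filter design; the sample rate, tap count, cutoff, and test signal below are placeholder assumptions.

```python
import numpy as np
from scipy import signal

# Generic illustration of the digital filter-then-decimate stage described above:
# a linear-phase low-pass FIR filter applied to baseband samples, followed by
# downsampling to cut the data rate leaving the antenna panel.
fs = 10e6                        # input sample rate (Hz), illustrative
decimation = 4                   # keep every 4th sample after filtering
cutoff = fs / (2 * decimation)   # anti-alias cutoff for the chosen decimation factor

taps = signal.firwin(numtaps=63, cutoff=cutoff, fs=fs)   # FIR design (placeholder order)

t = np.arange(0, 1e-3, 1 / fs)
x = np.cos(2 * np.pi * 0.5e6 * t) + 0.1 * np.random.randn(t.size)  # fake baseband data

filtered = signal.lfilter(taps, 1.0, x)    # digital filtering: consistent, temperature-independent
decimated = filtered[::decimation]         # decimation minimizes the output data rate
print(x.size, "->", decimated.size, "samples")
```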
Some Calculations for the RHIC Kicker
DOE Office of Scientific and Technical Information (OSTI.GOV)
Claus, J.
1996-12-01
The bunches that arrive from the AGS are put onto RHIC's median plane by a string of four injection kickers in each ring. There are four short kickers rather than one long one in order to keep the kicker filling time acceptable, filling time being defined as the amount of time needed to increase the deflecting field in the kicker from zero to its nominal value. During the filling process the energy stored in the deflecting field is moved from outside the kicker to its aperture; since energy can only be displaced with finite velocity, the filling time is non-zero for kickers of non-zero length and tends to increase with increasing length. It is one of the more important parameters of the kicker because it sets a lower limit on the time interval between the last of the already circulating bunches and the newly injected one, and thus an upper limit on the total number of bunches that can be injected. RF gymnastics can be used to pack the bunches tighter than this limit indicates, but such gymnastics require radial aperture beyond what would be required otherwise, as well as time, and probably special hardware. Minimization of the kicker's stored energy requires minimization of its aperture; it therefore presents a major aperture restriction. Unless it is placed at a point where the dispersion is negligible, its aperture would have to be increased in order to provide the radial space needed for the gymnastics. Both the amount of extra space needed and the rate of longitudinal displacement increase with the maximum deviation in energy of the bunch to be displaced from the nominal value, so taking more time for the exercise reduces the aperture requirements. This time is measured in terms of synchrotron periods and is not small. It adds directly to the filling time of each ring and therefore decreases the time-average luminosity. Evidently the maximization of the time-average luminosity is a complex issue in which the kicker filling time is a major parameter.
Todd E. Ristau; Susan L. Stout
2014-01-01
Assessment of regeneration can be time-consuming and costly. Often, foresters look for ways to minimize the cost of doing inventories. One potential method to reduce time required on a plot is use of percent cover data rather than seedling count data to determine stocking. Robust linear regression analysis was used in this report to predict seedling count data from...
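As a hedged sketch of the analysis named in the abstract, robust linear regression predicting seedling counts from percent cover might look as follows with statsmodels; the variable names and data are synthetic, not the study's.

```python
import numpy as np
import statsmodels.api as sm

# Toy illustration of robust linear regression predicting seedling counts from
# percent cover; the data are synthetic and the relationship is invented.
rng = np.random.default_rng(0)
percent_cover = rng.uniform(0, 100, size=60)
seedling_count = 2.0 + 0.8 * percent_cover + rng.normal(0, 5, size=60)
seedling_count[:3] += 60          # a few outlying plots that robust fitting should down-weight

X = sm.add_constant(percent_cover)
fit = sm.RLM(seedling_count, X, M=sm.robust.norms.HuberT()).fit()
print(fit.params)                 # intercept and slope relating cover to counts
```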
Hartwig, E; Schultheiss, M; Bischoff, M
2002-08-01
Some 30% of unstable vertebral fractures of the thoracic and lumbar spine involve destruction of the ventral column and thus of the supporting structures of the spine. This requires extensive surgical reconstruction procedures, which are carried out using minimally invasive techniques. The disadvantages of the minimally invasive methods are the high cost, the technical equipment required, and the time expenditure in the initial phase of performing the surgical procedure. With the structural reform of the health care system in the year 2000, the private-sector regulatory bodies were called upon to introduce a flat-rate compensation system for hospital services according to section 17b of the Hospital Law (KHG). The previous financing system, which involved per-diem operating cost rates, has thus been abolished. Individual case calculations are now required. Considering the case values to date, a contribution margin deficit of EUR 4628.45 has been calculated for our patients with fractures of the thoracic and lumbar spine without neurological deficits. Economically efficient medical care is thus no longer possible. Consequently, an adjustment of the German relative weights must urgently be demanded in order to guarantee high-quality medical care for patients.
Duerr, Adam E.; Miller, Tricia A.; Lanzone, Michael; Brandes, Dave; Cooper, Jeff; O'Malley, Kieran; Maisonneuve, Charles; Tremblay, Junior; Katzner, Todd
2012-01-01
To maximize fitness, flying animals should maximize flight speed while minimizing energetic expenditure. Soaring speeds of large-bodied birds are determined by flight routes and tradeoffs between minimizing time and energetic costs. Large raptors migrating in eastern North America predominantly glide between thermals that provide lift or soar along slopes or ridgelines using orographic lift (slope soaring). It is usually assumed that slope soaring is faster than thermal gliding because forward progress is constant compared to interrupted progress when birds pause to regain altitude in thermals. We tested this slope-soaring hypothesis using high-frequency GPS-GSM telemetry devices to track golden eagles during northbound migration. In contrast to expectations, flight speed was slower when slope soaring and eagles also were diverted from their migratory path, incurring possible energetic costs and reducing speed of progress towards a migratory endpoint. When gliding between thermals, eagles stayed on track and fast gliding speeds compensated for lack of progress during thermal soaring. When thermals were not available, eagles minimized migration time, not energy, by choosing energetically expensive slope soaring instead of waiting for thermals to develop. Sites suited to slope soaring include ridges preferred for wind-energy generation, thus avian risk of collision with wind turbines is associated with evolutionary trade-offs required to maximize fitness of time-minimizing migratory raptors. PMID:22558166
Duerr, Adam E; Miller, Tricia A; Lanzone, Michael; Brandes, Dave; Cooper, Jeff; O'Malley, Kieran; Maisonneuve, Charles; Tremblay, Junior; Katzner, Todd
2012-01-01
To maximize fitness, flying animals should maximize flight speed while minimizing energetic expenditure. Soaring speeds of large-bodied birds are determined by flight routes and tradeoffs between minimizing time and energetic costs. Large raptors migrating in eastern North America predominantly glide between thermals that provide lift or soar along slopes or ridgelines using orographic lift (slope soaring). It is usually assumed that slope soaring is faster than thermal gliding because forward progress is constant compared to interrupted progress when birds pause to regain altitude in thermals. We tested this slope-soaring hypothesis using high-frequency GPS-GSM telemetry devices to track golden eagles during northbound migration. In contrast to expectations, flight speed was slower when slope soaring and eagles also were diverted from their migratory path, incurring possible energetic costs and reducing speed of progress towards a migratory endpoint. When gliding between thermals, eagles stayed on track and fast gliding speeds compensated for lack of progress during thermal soaring. When thermals were not available, eagles minimized migration time, not energy, by choosing energetically expensive slope soaring instead of waiting for thermals to develop. Sites suited to slope soaring include ridges preferred for wind-energy generation, thus avian risk of collision with wind turbines is associated with evolutionary trade-offs required to maximize fitness of time-minimizing migratory raptors.
Bryce, Thomas N.; Dijkers, Marcel P.
2015-01-01
Background: Powered exoskeletons have been demonstrated as being safe for persons with spinal cord injury (SCI), but little is known about how users learn to manage these devices. Objective: To quantify the time and effort required by persons with SCI to learn to use an exoskeleton for assisted walking. Methods: A convenience sample was enrolled to learn to use the first-generation Ekso powered exoskeleton to walk. Participants were given up to 24 weekly sessions of instruction. Data were collected on assistance level, walking distance and speed, heart rate, perceived exertion, and adverse events. Time and effort was quantified by the number of sessions required for participants to stand up, walk for 30 minutes, and sit down, initially with minimal and subsequently with contact guard assistance. Results: Of 22 enrolled participants, 9 screen-failed, and 7 had complete data. All of these 7 were men; 2 had tetraplegia and 5 had motor-complete injuries. Of these, 5 participants could stand, walk, and sit with contact guard or close supervision assistance, and 2 required minimal to moderate assistance. Walk times ranged from 28 to 94 minutes with average speeds ranging from 0.11 to 0.21 m/s. For all participants, heart rate changes and reported perceived exertion were consistent with light to moderate exercise. Conclusion: This study provides preliminary evidence that persons with neurological weakness due to SCI can learn to walk with little or no assistance and light to somewhat hard perceived exertion using a powered exoskeleton. Persons with different severities of injury, including those with motor complete C7 tetraplegia and motor incomplete C4 tetraplegia, may be able to learn to use this device. PMID:26364280
Kozlowski, Allan J; Bryce, Thomas N; Dijkers, Marcel P
2015-01-01
Powered exoskeletons have been demonstrated as being safe for persons with spinal cord injury (SCI), but little is known about how users learn to manage these devices. To quantify the time and effort required by persons with SCI to learn to use an exoskeleton for assisted walking. A convenience sample was enrolled to learn to use the first-generation Ekso powered exoskeleton to walk. Participants were given up to 24 weekly sessions of instruction. Data were collected on assistance level, walking distance and speed, heart rate, perceived exertion, and adverse events. Time and effort was quantified by the number of sessions required for participants to stand up, walk for 30 minutes, and sit down, initially with minimal and subsequently with contact guard assistance. Of 22 enrolled participants, 9 screen-failed, and 7 had complete data. All of these 7 were men; 2 had tetraplegia and 5 had motor-complete injuries. Of these, 5 participants could stand, walk, and sit with contact guard or close supervision assistance, and 2 required minimal to moderate assistance. Walk times ranged from 28 to 94 minutes with average speeds ranging from 0.11 to 0.21 m/s. For all participants, heart rate changes and reported perceived exertion were consistent with light to moderate exercise. This study provides preliminary evidence that persons with neurological weakness due to SCI can learn to walk with little or no assistance and light to somewhat hard perceived exertion using a powered exoskeleton. Persons with different severities of injury, including those with motor complete C7 tetraplegia and motor incomplete C4 tetraplegia, may be able to learn to use this device.
Torczynski, John R.
2000-01-01
A spin coating apparatus requires less cleanroom air flow than prior spin coating apparatus to minimize cleanroom contamination. A shaped exhaust duct from the spin coater maintains process quality while requiring reduced cleanroom air flow. The exhaust duct can decrease in cross section as it extends from the wafer, minimizing eddy formation. The exhaust duct can conform to entrainment streamlines to minimize eddy formation and reduce interprocess contamination at minimal cleanroom air flow rates.
Decazes, J M; Ernst, J D; Sande, M A
1983-01-01
Ceftriaxone was highly active in eliminating Escherichia coli from the cerebrospinal fluid of rabbits with experimental meningitis. However, concentrations equal to or greater than 10 times the minimal bactericidal concentration had to be achieved to ensure optimal efficacy (rate of kill, 1.5 log10 CFU/ml per h). In contrast to other beta-lactams studied in this model, ceftriaxone concentrations in cerebrospinal fluid progressively increased even though serum steady state was maintained by constant infusion. The percent penetration was 2.1% after 1 h of therapy, in contrast to 8.9% after 7 h (P less than 0.001). In vitro time-kill curves determined in cerebrospinal fluid or broth more closely predicted the drug concentrations required for a maximal bactericidal effect in vivo than did determinations of minimal inhibitory or bactericidal concentrations. PMID:6316841
Design and architecture of the Mars relay network planning and analysis framework
NASA Technical Reports Server (NTRS)
Cheung, K. M.; Lee, C. H.
2002-01-01
In this paper we describe the design and architecture of the Mars Network planning and analysis framework, which supports generation and validation of efficient planning and scheduling strategies. The goals are to minimize transmission time, minimize delay time, and/or maximize network throughput. The proposed framework would require (1) a client-server architecture to support interactive, batch, web, and distributed analysis and planning applications for the relay network analysis scheme, (2) a high-fidelity modeling and simulation environment that expresses link capabilities between spacecraft and between spacecraft and Earth stations as time-varying resources, and that captures spacecraft activities, link priorities, Solar System dynamic events, the laws of orbital mechanics, and other limiting factors such as spacecraft power and thermal constraints, and (3) an optimization methodology that casts the resource and constraint models into a standard linear or nonlinear constrained optimization problem that lends itself to commercial off-the-shelf (COTS) planning and scheduling algorithms.
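As a toy version of step (3), casting resources and constraints into a standard constrained optimization problem, the sketch below maximizes data returned over a few relay passes subject to per-pass link capacity and a total onboard-data limit using a linear program; all capacities and volumes are invented for illustration and are not from the Mars Network models.

```python
import numpy as np
from scipy.optimize import linprog

# Toy linear program in the spirit of the framework described above: choose how
# much data x_i to return on each relay pass to maximize throughput, subject to
# per-pass link capacity (a time-varying resource) and the data available on board.
capacity = np.array([120.0, 80.0, 200.0, 50.0])   # Mb deliverable per pass (illustrative)
onboard = 300.0                                    # Mb of science data awaiting downlink

c = -np.ones(capacity.size)                        # maximize sum(x) == minimize -sum(x)
A_ub = np.ones((1, capacity.size))                 # total returned cannot exceed onboard data
b_ub = np.array([onboard])
bounds = list(zip(np.zeros_like(capacity), capacity))
res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds)
print(res.x, "Mb per pass;", -res.fun, "Mb total")
```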
Intelligent Sampling of Hazardous Particle Populations in Resource-Constrained Environments
NASA Astrophysics Data System (ADS)
McCollough, J. P.; Quinn, J. M.; Starks, M. J.; Johnston, W. R.
2017-10-01
Sampling of anomaly-causing space environment drivers is necessary for both real-time operations and satellite design efforts, and optimizing measurement sampling helps minimize resource demands. Relating these measurements to spacecraft anomalies requires the ability to resolve spatial and temporal variability in the energetic charged particle hazard of interest. Here we describe a method for sampling particle fluxes informed by magnetospheric phenomenology so that, along a given trajectory, the variations from both temporal dynamics and spatial structure are adequately captured while minimizing oversampling. We describe the coordinates, sampling method, and specific regions and parameters employed. We compare resulting sampling cadences with data from spacecraft spanning the regions of interest during a geomagnetically active period, showing that the algorithm retains the gross features necessary to characterize environmental impacts on space systems in diverse orbital regimes while greatly reducing the amount of sampling required. This enables sufficient environmental specification within a resource-constrained context, such as limited telemetry bandwidth, processing requirements, and timeliness.
Optimization Methods in Sherpa
NASA Astrophysics Data System (ADS)
Siemiginowska, Aneta; Nguyen, Dan T.; Doe, Stephen M.; Refsdal, Brian L.
2009-09-01
Forward fitting is a standard technique used to model X-ray data. A statistic, usually weighted chi^2 or a Poisson likelihood (e.g., Cash), is minimized in the fitting process to obtain a set of the best model parameters. Astronomical models often have complex forms with many parameters that can be correlated (e.g., an absorbed power law). Minimization is not trivial in such a setting, as the statistical parameter space becomes multimodal and finding the global minimum is hard. Standard minimization algorithms can be found in many libraries of scientific functions, but they are usually focused on specific functions. However, Sherpa, designed as a general fitting and modeling application, requires very robust optimization methods that can be applied to a variety of astronomical data (X-ray spectra, images, timing, optical data, etc.). We developed several optimization algorithms in Sherpa targeting a wide range of minimization problems. Two local minimization methods were built: the Levenberg-Marquardt algorithm was obtained from the MINPACK subroutine LMDIF and modified to achieve the required robustness, and the Nelder-Mead simplex method was implemented in-house based on variations of the algorithm described in the literature. A global-search Monte Carlo method has been implemented following the differential evolution algorithm presented by Storn and Price (1997). We will present the methods in Sherpa and discuss their use cases. We will focus on the application to Chandra data, showing both 1D and 2D examples. This work is supported by NASA contract NAS8-03060 (CXC).
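To make the global-search idea concrete, here is a generic illustration (not Sherpa's own interface) of fitting a two-parameter power-law model by minimizing weighted chi^2 with SciPy's differential evolution, which follows the Storn and Price (1997) algorithm cited above; the model and data are synthetic.

```python
import numpy as np
from scipy.optimize import differential_evolution

# Generic forward-fitting illustration: minimize weighted chi^2 for a simple
# two-parameter power-law model using differential evolution (Storn & Price 1997).
# The data are synthetic; this is not Sherpa's API.
rng = np.random.default_rng(1)
energy = np.linspace(0.5, 8.0, 40)                  # keV, synthetic grid
true_norm, true_index = 3.0, 1.7
data = true_norm * energy ** (-true_index) + rng.normal(0, 0.05, energy.size)
sigma = np.full_like(data, 0.05)

def chi2(params):
    norm, index = params
    model = norm * energy ** (-index)
    return np.sum(((data - model) / sigma) ** 2)

result = differential_evolution(chi2, bounds=[(0.1, 10.0), (0.0, 4.0)], seed=2)
print(result.x, result.fun)                         # best-fit (norm, index) and chi^2
```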
Nursing home medication administration cost minimization analysis.
Hamrick, Irene; Nye, Ann Marie; Gardner, Casey K
2007-03-01
To assess the time it takes nurses to administer medications in the nursing home setting, to calculate the nursing cost of medication administration, and to determine whether the use of extended-release products is justified by decreased nursing costs. Cost-minimization analysis using observational data from a time-motion analysis. Two 150-bed nursing homes in rural eastern North Carolina. Nurses working during first and second shifts. Nurses were timed as they each administered medications to 12 patients. The mean time required to administer each dosage form was calculated. The cost of nursing time was based on the average nursing staff salary of $20.45 per hour as reported by the directors of nursing. The time and cost to dispense one more medication during an existing medication pass and during an additional medication pass were calculated. The time to administer an additional dose of an oral medication to one patient was 45.01 seconds during an already scheduled medication pass and 63.05 seconds during a new medication pass. Adding an oral medication once a day will cost $7.67 per month per patient if it is administered at the same time as other medications, or $10.74 per month if a new medication pass is required. The administration of other dosage forms, such as crushed medications, percutaneous endoscopic gastrostomy, injections, and patches, was more time-consuming and thus costlier. Formulas are provided to calculate medication administration cost based on local salaries. Nursing time and costs for medication administration in the nursing home are substantial and should be considered when selecting a product. This may justify the selection of higher-cost extended-release products.
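The quoted monthly costs follow directly from the measured administration times and the $20.45/hour wage; a short check, assuming a 30-day month (an assumption, since the abstract does not state the month length), reproduces both figures.

```python
# Reproduce the monthly administration costs from the observed timings
# and the $20.45/hour nursing wage, assuming a 30-day month.
wage_per_hour = 20.45
days_per_month = 30

def monthly_cost(seconds_per_dose):
    hours = seconds_per_dose * days_per_month / 3600
    return hours * wage_per_hour

print(round(monthly_cost(45.01), 2))   # existing medication pass -> ~7.67
print(round(monthly_cost(63.05), 2))   # new medication pass      -> ~10.74
```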
Luo, He; Liang, Zhengzheng; Zhu, Moning; Hu, Xiaoxuan; Wang, Guoqiang
2018-01-01
Wind has a significant effect on the control of fixed-wing unmanned aerial vehicles (UAVs), resulting in changes in their ground speed and direction, which has an important influence on the results of integrated optimization of UAV task allocation and path planning. The objective of this integrated optimization problem changes from minimizing flight distance to minimizing flight time. In this study, the Euclidean distance between any two targets is expanded to the Dubins path length, considering the minimum turning radius of fixed-wing UAVs. According to the vector relationship between wind speed, UAV airspeed, and UAV ground speed, a method is proposed to calculate the flight time of UAV between targets. On this basis, a variable-speed Dubins path vehicle routing problem (VS-DP-VRP) model is established with the purpose of minimizing the time required for UAVs to visit all the targets and return to the starting point. By designing a crossover operator and mutation operator, the genetic algorithm is used to solve the model, the results of which show that an effective UAV task allocation and path planning solution under steady wind can be provided.
Liang, Zhengzheng; Zhu, Moning; Hu, Xiaoxuan; Wang, Guoqiang
2018-01-01
Wind has a significant effect on the control of fixed-wing unmanned aerial vehicles (UAVs), resulting in changes in their ground speed and direction, which has an important influence on the results of integrated optimization of UAV task allocation and path planning. The objective of this integrated optimization problem changes from minimizing flight distance to minimizing flight time. In this study, the Euclidean distance between any two targets is expanded to the Dubins path length, considering the minimum turning radius of fixed-wing UAVs. According to the vector relationship between wind speed, UAV airspeed, and UAV ground speed, a method is proposed to calculate the flight time of UAV between targets. On this basis, a variable-speed Dubins path vehicle routing problem (VS-DP-VRP) model is established with the purpose of minimizing the time required for UAVs to visit all the targets and return to the starting point. By designing a crossover operator and mutation operator, the genetic algorithm is used to solve the model, the results of which show that an effective UAV task allocation and path planning solution under steady wind can be provided. PMID:29561888
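The flight-time calculation rests on the vector relation ground velocity = air velocity + wind velocity: the UAV crabs into the wind so that the cross-track wind component is cancelled, and the along-track remainder sets the ground speed. A minimal sketch (all values and names are illustrative, not from the paper):

```python
import numpy as np

# Ground speed along a desired track under steady wind, from the relation
# ground velocity = air velocity + wind velocity. The UAV crabs into the wind
# so the cross-track wind component is cancelled; values are illustrative.
def ground_speed(airspeed, wind, track_unit):
    """airspeed: scalar; wind: 2D wind vector; track_unit: unit vector of the desired track."""
    w_along = np.dot(wind, track_unit)               # tailwind (+) or headwind (-) component
    w_cross = wind - w_along * track_unit            # cross-track wind to be cancelled
    w_cross_mag = np.linalg.norm(w_cross)
    if w_cross_mag >= airspeed:
        raise ValueError("wind too strong to hold this track")
    return w_along + np.sqrt(airspeed**2 - w_cross_mag**2)

track = np.array([1.0, 0.0])       # fly due east between two targets
wind = np.array([-4.0, 3.0])       # steady wind vector, m/s
v_air = 18.0                       # UAV airspeed, m/s
leg_length = 2000.0                # metres between targets
v_ground = ground_speed(v_air, wind, track)
print(v_ground, "m/s ->", leg_length / v_ground, "s for this leg")
```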
During the course of our research, we expanded and evolved our initial concept to achieve design targets of minimized cost of electricity. Several biofuel pathways were examined, and each had drawbacks in terms of cost (2-3 times market rates for energy), land area required (5 to...
76 FR 79141 - List of Rules To Be Reviewed Pursuant to the Regulatory Flexibility Act
Federal Register 2010, 2011, 2012, 2013, 2014
2011-12-21
... initial notice within a reasonable time after establishing a customer relationship in two additional..., requires an agency to review its rules that have a significant economic impact upon a substantial number of... be amended or rescinded * * * to minimize any significant economic impact of the rules upon a...
Your College Degree: The External Degree Way.
ERIC Educational Resources Information Center
Haponski, William C.; And Others
Information on undertaking an external degree program to obtain a college education is presented. An external degree program is one that has no or minimal requirements for residence (on-campus attendance). Most often it can be entered at any time of the year and usually grants credit for documented learning already acquired. An external degree…
Earned and Unearned Degrees, Earned and Unearned Teaching Certificates: Implications for Education.
ERIC Educational Resources Information Center
Shaughnessy, Michael F.; Gaedke, Billy
This article discusses the impact of instructional television, directed study courses, and other alternative teacher certification methods. Colleges and universities are becoming aware of nontraditional programs that require minimal, if any, time on campus or direct contact with instructors. Soon, there will be a proliferation of Internet courses.…
Stochastic Frontier Estimation of Efficient Learning in Video Games
ERIC Educational Resources Information Center
Hamlen, Karla R.
2012-01-01
Stochastic Frontier Regression Analysis was used to investigate strategies and skills that are associated with the minimization of time required to achieve proficiency in video games among students in grades four and five. Students self-reported their video game play habits, including strategies and skills used to become good at the video games…
NASA Technical Reports Server (NTRS)
1971-01-01
Developed methodologies and procedures for the reduction of microbial burden on an assembled spacecraft at the time of encapsulation or terminal sterilization are reported. This technology is required for reducing excessive microbial burden on spacecraft components for the purposes of either decreasing planetary contamination probabilities for an orbiter or minimizing the duration of a sterilization process for a lander.
Federal Register 2010, 2011, 2012, 2013, 2014
2011-05-03
... can be provided in the desired format, reporting burden (time and financial resources) is minimized, collection instruments are clearly understood, and the impact of collection requirements on respondents can... collection request can be obtained by contacting the office listed below in the ADDRESSES section of this...
Federal Register 2010, 2011, 2012, 2013, 2014
2011-05-03
... can be provided in the desired format, reporting burden (time and financial resources) is minimized, collection instruments are clearly understood, and the impact of collection requirements on respondents can... Employment History (CM- 911A). A copy of the proposed information collection request can be obtained by...
Grading Homework to Emphasize Problem-Solving Process Skills
ERIC Educational Resources Information Center
Harper, Kathleen A.
2012-01-01
This article describes a grading approach that encourages students to employ particular problem-solving skills. Some strengths of this method, called "process-based grading," are that it is easy to implement, requires minimal time to grade, and can be used in conjunction with either an online homework delivery system or paper-based homework.
40 CFR 60.4333 - What are my general requirements for complying with this subpart?
Code of Federal Regulations, 2011 CFR
2011-07-01
... consistent with good air pollution control practices for minimizing emissions at all times including during startup, shutdown, and malfunction. (b) When an affected unit with heat recovery utilizes a common steam... the other unit(s) utilizing the common heat recovery unit; or (2) Develop, demonstrate, and provide...
40 CFR 60.4333 - What are my general requirements for complying with this subpart?
Code of Federal Regulations, 2010 CFR
2010-07-01
... consistent with good air pollution control practices for minimizing emissions at all times including during startup, shutdown, and malfunction. (b) When an affected unit with heat recovery utilizes a common steam... the other unit(s) utilizing the common heat recovery unit; or (2) Develop, demonstrate, and provide...
The Efficiency of Public Spending on Education: An Empirical Comparison of EU Countries
ERIC Educational Resources Information Center
Agasisti, Tommaso
2014-01-01
Recent policy suggestions from the European Community underlined the importance of "efficiency" and "equity" in the provision of education while, at the same time, the European countries are required to provide their educational services by minimizing the amount of public money devoted to them. In this article, an empirical…
Federal Register 2010, 2011, 2012, 2013, 2014
2010-03-04
... Cost (Operation and Maintenance): $0. IV. Public Participation--Submission of Comments on This Notice... and costs) is minimal, collection instruments are clearly understood, and OSHA's estimate of the... information is useful; The accuracy of OSHA's estimate of the burden (time and costs) of the information...
Federal Register 2010, 2011, 2012, 2013, 2014
2011-10-20
.... Estimated Cost (Operation and Maintenance): $54,197 IV. Public Participation--Submission of Comments on This... costs) is minimal, collection instruments are clearly understood, and OSHA's estimate of the information... accuracy of OSHA's estimate of the burden (time and costs) of the information collection requirements...
A technique for thermal desorption analyses suitable for thermally-labile, volatile compounds
USDA-ARS?s Scientific Manuscript database
Our group has for some time studied below ground plant produced volatile signals affecting nematode and insect behavior. The research requires repeated sampling of intact plant/soil systems in the lab as well as the field with the help of probes to minimize unwanted effects on the systems we are stu...
Introducing and Developing Map Skills with Persons Having Mild or Moderate Learning Difficulties.
ERIC Educational Resources Information Center
Renfrew, Tom
1997-01-01
A British project found that appropriate training in map skills enabled children and adults with mild mental retardation to complete a white color-coded orienteering course with minimal assistance but that persons with moderate mental retardation required more assistance and instruction time to complete course objectives. Describes approaches to…
ERIC Educational Resources Information Center
Haynes, John; Miller, Judith
2015-01-01
Background: Pre-service teacher education (PSTE) programmes for generalist primary school teachers have limited time allocated to Physical Education, Health and Personal Development. In practice, teachers in schools are required to assess motor skills despite the fact that their training provides minimal preparation. This necessitates creative…
Morrato, Elaine H; Smith, Meredith Y
2015-01-01
Pharmaceutical risk minimization programs are now an established requirement in the regulatory landscape. However, pharmaceutical companies have been slow to recognize and embrace the significant potential these programs offer in terms of enhancing trust with health care professionals and patients and of providing a mechanism for bringing products to market that might not otherwise have been approved. Pitfalls of the current drug development process include risk minimization programs that are not data driven, missed opportunities to incorporate pragmatic methods and market-based insights, outmoded tools and data sources, lack of rapid evaluative learning to support timely adaptation, lack of systematic approaches for patient engagement, and questions on staffing and organizational infrastructure. We propose better integration of risk minimization with clinical drug development and commercialization work streams throughout the product lifecycle. We articulate a vision and propose broad adoption of organizational models for incorporating risk minimization expertise into the drug development process. Three organizational models are discussed and compared: outsource/external vendor, embedded risk management specialist, and Center of Excellence. PMID:25750537
NASA Astrophysics Data System (ADS)
Kim, Ji-Su; Park, Jung-Hyeon; Lee, Dong-Ho
2017-10-01
This study addresses a variant of job-shop scheduling in which jobs are grouped into job families but are processed individually. The problem can be found in various industrial systems, especially in reprocessing shops of remanufacturing systems. If the reprocessing shop is a job-shop type and has component-matching requirements, it can be regarded as a job shop with job families, since the components of a product constitute a job family. In particular, sequence-dependent set-ups, in which the set-up time depends on the job just completed and the next job to be processed, are also considered. The objective is to minimize the total family flow time, where a family's flow time is the largest completion time among the jobs in that family. A mixed-integer programming model is developed and two iterated greedy algorithms with different local search methods are proposed. Computational experiments were conducted on modified benchmark instances and the results are reported.
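A stripped-down illustration of the iterated greedy idea, destruction followed by greedy reconstruction and an acceptance test, applied to a single-machine version of the problem with sequence-dependent setups and a total-family-flow-time objective. This is a toy sketch under invented data, not the paper's job-shop algorithm.

```python
import random

# Toy iterated greedy for a single-machine relative of the problem above:
# jobs belong to families, setups are sequence-dependent, and the objective is
# the total family flow time (sum over families of the completion time of the
# family's last job). All data are invented.
random.seed(0)
n_jobs, n_families = 10, 3
family = [j % n_families for j in range(n_jobs)]
proc = [random.randint(2, 9) for _ in range(n_jobs)]
setup = [[0 if i == j else random.randint(1, 4) for j in range(n_jobs)] for i in range(n_jobs)]

def total_family_flow_time(seq):
    t, done, prev = 0, {}, None
    for j in seq:
        t += (setup[prev][j] if prev is not None else 0) + proc[j]
        done[family[j]] = t          # a family finishes when its last job finishes
        prev = j
    return sum(done.values())

def greedy_insert(partial, job):
    best = None
    for i in range(len(partial) + 1):           # try every insertion position
        cand = partial[:i] + [job] + partial[i:]
        cost = total_family_flow_time(cand)
        if best is None or cost < best[0]:
            best = (cost, cand)
    return best[1]

seq = []
for j in sorted(range(n_jobs), key=lambda j: proc[j]):   # initial greedy construction
    seq = greedy_insert(seq, j)

best_seq, best_cost = seq, total_family_flow_time(seq)
for _ in range(200):                                     # iterated greedy loop
    removed = random.sample(best_seq, 3)                 # destruction
    partial = [j for j in best_seq if j not in removed]
    for j in removed:                                    # greedy reconstruction
        partial = greedy_insert(partial, j)
    cost = total_family_flow_time(partial)
    if cost < best_cost:                                 # accept improvements only
        best_seq, best_cost = partial, cost
print(best_seq, best_cost)
```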
Immobilization techniques to avoid enzyme loss from oxidase-based biosensors: a one-year study.
House, Jody L; Anderson, Ellen M; Ward, W Kenneth
2007-01-01
Continuous amperometric sensors that measure glucose or lactate require a stable sensitivity, and glutaraldehyde crosslinking has been used widely to avoid enzyme loss. Nonetheless, little data is published on the effectiveness of enzyme immobilization with glutaraldehyde. A combination of electrochemical testing and spectrophotometric assays was used to study the relationship between enzyme shedding and the fabrication procedure. In addition, we studied the relationship between the glutaraldehyde concentration and sensor performance over a period of one year. The enzyme immobilization process by glutaraldehyde crosslinking to glucose oxidase appears to require at least 24 hours at room temperature to reach completion. In addition, excess free glucose oxidase can be removed by soaking sensors in purified water for 20 minutes. Even with the addition of these steps, however, it appears that some free glucose oxidase remains entrapped within the enzyme layer, which contributes to a decline in sensitivity over time. Although it reduces the ultimate sensitivity (probably via a change in the enzyme's natural conformation), the glutaraldehyde concentration in the enzyme layer can be increased in order to minimize this instability. After exposure of oxidase enzymes to glutaraldehyde, effective crosslinking requires a rinse step and a 24-hour incubation step. In order to minimize the loss of sensor sensitivity over time, the glutaraldehyde concentration can be increased.
Lower Limits on Aperture Size for an ExoEarth Detecting Coronagraphic Mission
NASA Technical Reports Server (NTRS)
Stark, Christopher C.; Roberge, Aki; Mandell, Avi; Clampin, Mark; Domagal-Goldman, Shawn D.; McElwain, Michael W.; Stapelfeldt, Karl R.
2015-01-01
The yield of Earth-like planets will likely be a primary science metric for future space-based missions that will drive telescope aperture size. Maximizing the exoEarth candidate yield is therefore critical to minimizing the required aperture. Here we describe a method for exoEarth candidate yield maximization that simultaneously optimizes, for the first time, the targets chosen for observation, the number of visits to each target, the delay time between visits, and the exposure time of every observation. This code calculates both the detection time and multiwavelength spectral characterization time required for planets. We also refine the astrophysical assumptions used as inputs to these calculations, relying on published estimates of planetary occurrence rates as well as theoretical and observational constraints on terrestrial planet sizes and classical habitable zones. Given these astrophysical assumptions, optimistic telescope and instrument assumptions, and our new completeness code that produces the highest yields to date, we suggest lower limits on the aperture size required to detect and characterize a statistically motivated sample of exoEarths.
Supportability Technologies for Future Exploration Missions
NASA Technical Reports Server (NTRS)
Watson, Kevin; Thompson, Karen
2007-01-01
Future long-duration human exploration missions will be challenged by resupply limitations and mass and volume constraints. Consequently, it will be essential that the logistics footprint required to support these missions be minimized and that capabilities be provided to make them highly autonomous from a logistics perspective. Strategies to achieve these objectives include broad implementation of commonality and standardization at all hardware levels and across all systems, repair of failed hardware at the lowest possible hardware level, and manufacture of structural and mechanical replacement components as needed. Repair at the lowest hardware levels will require the availability of compact, portable systems for diagnosis of failures in electronic systems and verification of system functionality following repair. Rework systems will be required that enable the removal and replacement of microelectronic components with minimal human intervention to minimize skill requirements and training demand for crews. Materials used in the assembly of electronic systems (e.g. solders, fluxes, conformal coatings) must be compatible with the available repair methods and the spacecraft environment. Manufacturing of replacement parts for structural and mechanical applications will require additive manufacturing systems that can generate near-net-shape parts from the range of engineering alloys employed in the spacecraft structure and in the parts utilized in other surface systems. These additive manufacturing processes will need to be supported by real-time non-destructive evaluation during layer-additive processing for on-the-fly quality control. This will provide capabilities for quality control and may serve as an input for closed-loop process control. Additionally, non-destructive methods should be available for material property determination. These nondestructive evaluation processes should be incorporated with the additive manufacturing process - providing an in-process capability to ensure that material deposited during layer-additive processing meets required material property criteria.
QRTEngine: An easy solution for running online reaction time experiments using Qualtrics.
Barnhoorn, Jonathan S; Haasnoot, Erwin; Bocanegra, Bruno R; van Steenbergen, Henk
2015-12-01
Performing online behavioral research is gaining increased popularity among researchers in psychological and cognitive science. However, the currently available methods for conducting online reaction time experiments are often complicated and typically require advanced technical skills. In this article, we introduce the Qualtrics Reaction Time Engine (QRTEngine), an open-source JavaScript engine that can be embedded in the online survey development environment Qualtrics. The QRTEngine can be used to easily develop browser-based online reaction time experiments with accurate timing within current browser capabilities, and it requires only minimal programming skills. After introducing the QRTEngine, we briefly discuss how to create and distribute a Stroop task. Next, we describe a study in which we investigated the timing accuracy of the engine under different processor loads using external chronometry. Finally, we show that the QRTEngine can be used to reproduce classic behavioral effects in three reaction time paradigms: a Stroop task, an attentional blink task, and a masked-priming task. These findings demonstrate that QRTEngine can be used as a tool for conducting online behavioral research even when this requires accurate stimulus presentation times.
Optimal trajectories of aircraft and spacecraft
NASA Technical Reports Server (NTRS)
Miele, A.
1990-01-01
Work done on algorithms for the numerical solution of optimal control problems and their application to the computation of optimal flight trajectories of aircraft and spacecraft is summarized. General considerations on the calculus of variations, optimal control, numerical algorithms, and applications of these algorithms to real-world problems are presented. The sequential gradient-restoration algorithm (SGRA) is examined for the numerical solution of optimal control problems of the Bolza type. Both the primal formulation and the dual formulation are discussed. Aircraft trajectories, in particular the application of the dual sequential gradient-restoration algorithm (DSGRA) to the determination of optimal flight trajectories in the presence of windshear, are described. Both take-off trajectories and abort landing trajectories are discussed. Take-off trajectories are optimized by minimizing the peak deviation of the absolute path inclination from a reference value. Abort landing trajectories are optimized by minimizing the peak drop of altitude from a reference value. The survival capability of an aircraft in a severe windshear is discussed, and the optimal trajectories are found to be superior to both constant-pitch trajectories and maximum-angle-of-attack trajectories. Spacecraft trajectories, in particular the application of the primal sequential gradient-restoration algorithm (PSGRA) to the determination of optimal flight trajectories for aeroassisted orbital transfer, are examined. Both the coplanar case and the noncoplanar case are discussed within the frame of three problems: minimization of the total characteristic velocity; minimization of the time integral of the square of the path inclination; and minimization of the peak heating rate. The solution of the second problem is called the nearly-grazing solution, and its merits are pointed out as a useful engineering compromise between energy requirements and aerodynamic heating requirements.
Protein electron transfer: Dynamics and statistics
NASA Astrophysics Data System (ADS)
Matyushov, Dmitry V.
2013-07-01
Electron transfer between redox proteins participating in energy chains of biology is required to proceed with high energetic efficiency, minimizing losses of redox energy to heat. Within the standard models of electron transfer, this requirement, combined with the need for unidirectional (preferably activationless) transitions, is translated into the need to minimize the reorganization energy of electron transfer. This design program is, however, unrealistic for proteins whose active sites are typically positioned close to the polar and flexible protein-water interface to allow inter-protein electron tunneling. The high flexibility of the interfacial region makes both the hydration water and the surface protein layer act as highly polar solvents. The reorganization energy, as measured by fluctuations, is not minimized, but rather maximized in this region. Natural systems in fact utilize the broad breadth of interfacial electrostatic fluctuations, but in the ways not anticipated by the standard models based on equilibrium thermodynamics. The combination of the broad spectrum of static fluctuations with their dispersive dynamics offers the mechanism of dynamical freezing (ergodicity breaking) of subsets of nuclear modes on the time of reaction/residence of the electron at a redox cofactor. The separation of time-scales of nuclear modes coupled to electron transfer allows dynamical freezing. In particular, the separation between the relaxation time of electro-elastic fluctuations of the interface and the time of conformational transitions of the protein caused by changing redox state results in dynamical freezing of the latter for sufficiently fast electron transfer. The observable consequence of this dynamical freezing is significantly different reorganization energies describing the curvature at the bottom of electron-transfer free energy surfaces (large) and the distance between their minima (Stokes shift, small). The ratio of the two reorganization energies establishes the parameter by which the energetic efficiency of protein electron transfer is increased relative to the standard expectations, thus minimizing losses of energy to heat. Energetically efficient electron transfer occurs in a chain of conformationally quenched cofactors and is characterized by flattened free energy surfaces, reminiscent of the flat and rugged landscape at the stability basin of a folded protein.
NASA Astrophysics Data System (ADS)
Özcan, Abdullah; Rivière-Lorphèvre, Edouard; Ducobu, François
2018-05-01
In part manufacturing, an efficient process should minimize the cycle time needed to reach the prescribed quality on the part. In order to optimize it, the machining time needs to be as low as possible and the quality needs to meet the requirements. For a 2D milling toolpath defined by sharp corners, the programmed feedrate differs from the reachable feedrate due to kinematic limits of the motor drives. This phenomenon leads to a loss of productivity. Smoothing the toolpath significantly reduces the machining time, but the dimensional accuracy should not be neglected. Therefore, a way to address the problem of optimizing a toolpath in part manufacturing is to take into account both the manufacturing time and the part quality. On one hand, maximizing the feedrate will minimize the manufacturing time and, on the other hand, the maximum of the contour error needs to be kept under a threshold to meet the quality requirements. This paper presents a method to optimize sharp corner smoothing using B-spline curves by adjusting the control points defining the curve. The objective function used in the optimization process is based on the contour error and the difference between the programmed feedrate and an estimation of the reachable feedrate. The estimation of the reachable feedrate is based on geometrical information. Simulation results are presented in the paper and the machining times are compared in each case.
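As a rough illustration of this kind of corner-smoothing optimization, the Python sketch below rounds a single sharp corner with a quadratic Bézier span and tunes the blending distance against a contour-error tolerance. The geometry, tolerance, acceleration limit, and curvature-based feedrate estimate are illustrative assumptions rather than the authors' formulation.

```python
# Hedged sketch: trade contour error against reachable feedrate when rounding
# a sharp 2D corner with a quadratic Bezier span. All numbers (tolerance,
# acceleration limit, programmed feedrate) are assumed for illustration.
import numpy as np
from scipy.optimize import minimize_scalar

corner = np.array([0.0, 0.0])     # sharp corner of the programmed toolpath
p_in = np.array([-10.0, 0.0])     # point on the incoming segment
p_out = np.array([0.0, 10.0])     # point on the outgoing segment
tolerance = 0.05                  # mm, allowed contour error at the corner
f_prog = 3000.0                   # mm/min, programmed feedrate
a_max = 2000.0                    # mm/s^2, assumed drive acceleration limit

def corner_curve(d):
    """Quadratic Bezier joining points a distance d before/after the corner."""
    a = corner + d * (p_in - corner) / np.linalg.norm(p_in - corner)
    b = corner + d * (p_out - corner) / np.linalg.norm(p_out - corner)
    t = np.linspace(0.0, 1.0, 50)[:, None]
    return (1 - t) ** 2 * a + 2 * t * (1 - t) * corner + t ** 2 * b

def objective(d):
    curve = corner_curve(d)
    # contour error: closest approach of the smoothed curve to the corner
    contour_error = np.min(np.linalg.norm(curve - corner, axis=1))
    # crude reachable-feedrate estimate from the local corner radius
    radius = max(contour_error, 1e-6)
    f_reach = min(f_prog, 60.0 * np.sqrt(a_max * radius))
    penalty = 1e6 if contour_error > tolerance else 0.0
    return (f_prog - f_reach) ** 2 + penalty

best = minimize_scalar(objective, bounds=(0.1, 5.0), method="bounded")
print("blending distance d =", round(best.x, 3), "mm")
```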
The Minimal Cost of Life in Space
NASA Astrophysics Data System (ADS)
Drysdale, A.; Rutkze, C.; Albright, L.; Ladue, R.
Life in space requires protection from the external environment, provision of a suitable internal environment, provision of consumables to maintain life, and removal of wastes. Protection from the external environment will mainly require shielding from radiation and meteoroids. Provision of a suitable environment inside the spacecraft will require provision of suitable air pressure and composition, temperature, and protection from environmental toxins (trace contaminants) and pathogenic micro-organisms. Gravity may be needed for longer missions to avoid excessive changes such as decalcification and muscle degeneration. Similarly, the volume required per crewmember will increase as the mission duration increases. Consumables required include oxygen, food, and water. Nitrogen might be required, depending on the total pressure and non-metabolic losses. We normally provide these consumables from the Earth, with a greater or lesser degree of regeneration. In principle, all consumables can be regenerated. Water and air are easiest to regenerate. At the present time, food can only be regenerated by using plants, and higher plants at that. Waste must be removed, including carbon dioxide and other metabolic waste as well as trash such as food packaging, filters, and expended spare parts. This can be done by dumping or regeneration. The minimal cost of life in space would be to use a synthesis process or system to regenerate all consumables from wastes. As the efficiency of the various processes rises, the minimal cost of life support will fall. However, real world regeneration requires significant equipment, power, and crew time. Make-up will be required for those items that cannot be economically regenerated. For very inefficient processes, it might be cheaper to ship all or part of the consumables. We are currently far down the development curve, and for short missions it is cheaper to ship consumables. For longer duration missions, greater closure is cost effective. However, there will always be losses and inefficiencies. An example is provided using the Cornell Controlled Environment Agriculture experimental unit to define the real-world state of the art. This unit produces lettuce on a small but commercial scale. System inputs and outputs are well documented. Some changes can be identified that could be made to reduce the cost of such a system in space. Other changes would be speculative. The mission impact of such a plant production system is estimated using an equivalent system mass approach.
Shuttle's 160 hour ground turnaround - A design driver
NASA Technical Reports Server (NTRS)
Widick, F.
1977-01-01
Turnaround analysis added a new dimension to the Space Program with the advent of the Space Shuttle. The requirement to turn the flight hardware around in 160 working hours from landing to launch was a significant design driver and a useful tool in forcing the integration of flight and ground systems design to permit an efficient ground operation. Although there was concern that time constraints might increase program costs, the result of the analysis was to minimize facility requirements and simplify operations with resultant cost savings.
Method for Hot Real-Time Sampling of Gasification Products
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pomeroy, Marc D
The Thermochemical Process Development Unit (TCPDU) at the National Renewable Energy Laboratory (NREL) is a highly instrumented half-ton/day pilot-scale plant capable of demonstrating industrially relevant thermochemical technologies for lignocellulosic biomass conversion, including gasification. Gasification creates primarily syngas (a mixture of hydrogen and carbon monoxide) that can be utilized with synthesis catalysts to form transportation fuels and other valuable chemicals. Biomass-derived gasification products are a very complex mixture of chemical components that typically contain sulfur and nitrogen species that can act as catalysis poisons for tar reforming and synthesis catalysts. Real-time hot online sampling techniques, such as Molecular Beam Mass Spectrometry (MBMS), and gas chromatographs with sulfur- and nitrogen-specific detectors can provide real-time analysis and operational indicators of performance. Sampling typically requires coated sampling lines to minimize trace sulfur interactions with steel surfaces. Other materials used inline have also shown conversion of sulfur species into new components and must be minimized. Residence time within the sampling lines must also be kept to a minimum to reduce further reaction chemistries. Solids from ash and char contribute to plugging and must be filtered at temperature. Experience at NREL has shown several key factors to consider when designing and installing an analytical sampling system for biomass gasification products: minimizing sampling distance, effective filtering as close to the source as possible, proper line sizing, proper line materials or coatings, even heating of all components, minimizing pressure drops, and additional filtering or traps after pressure drops.
NASA Technical Reports Server (NTRS)
Herman, D. H.; Niehoff, J. C.; Spadoni, D. J.
1980-01-01
An approach is proposed for the structuring of a planetary mission set wherein the peak annual funding is minimized to meet the annual budget constraint. One aspect of the approach is to have a transportation capability that can launch a mission in any planetary opportunity; such capability can be provided by solar electric propulsion. Another cost reduction technique is to structure the mission set in a time-sequenced fashion that could utilize essentially the same spacecraft for the implementation of several missions. A third technique would be to fulfill a scientific objective in several sequential missions rather than attempt to accomplish all of the objectives with one mission. The application of the approach is illustrated by an example involving the Solar Orbiter Dual Probe mission.
NASA Technical Reports Server (NTRS)
Yamaleev, N. K.; Diskin, B.; Nielsen, E. J.
2009-01-01
We study local-in-time adjoint-based methods for minimization of flow matching functionals subject to the 2-D unsteady compressible Euler equations. The key idea of the local-in-time method is to construct a very accurate approximation of the global-in-time adjoint equations and the corresponding sensitivity derivative by using only local information available on each time subinterval. In contrast to conventional time-dependent adjoint-based optimization methods which require backward-in-time integration of the adjoint equations over the entire time interval, the local-in-time method solves local adjoint equations sequentially over each time subinterval. Since each subinterval contains relatively few time steps, the storage cost of the local-in-time method is much lower than that of the global adjoint formulation, thus making the time-dependent optimization feasible for practical applications. The paper presents a detailed comparison of the local- and global-in-time adjoint-based methods for minimization of a tracking functional governed by the Euler equations describing the flow around a circular bump. Our numerical results show that the local-in-time method converges to the same optimal solution obtained with the global counterpart, while drastically reducing the memory cost as compared to the global-in-time adjoint formulation.
Fastener Capture Plate Technology to Contain On-Orbit Debris
NASA Technical Reports Server (NTRS)
Eisenhower, Kevin
2010-01-01
The Fastener Capture Plate technology was developed to solve the problem of capturing loose hardware and small fasteners, items that were not originally intended to be disengaged in microgravity, thus preventing them from becoming space debris. This technology was incorporated into astronaut tools designed and successfully used on NASA's Hubble Space Telescope Servicing Mission #4. The technology's ultimate benefit is that it allows a very time-efficient method for disengaging fasteners and removing hardware while minimizing the chances of losing parts or generating debris. The technology aims to simplify the manual labor required of the operator. It does so by optimizing visibility and access to the work site and minimizing the operator's need to be concerned with debris while performing the operations. It has a range of unique features that were developed to minimize task time, as well as maximize the ease and confidence of the astronaut operator. This paper describes the technology and the astronaut tools developed specifically for a complicated on-orbit repair, and it includes photographs of the hardware being used in outer space.
Almeida, Murilo P.; Parteli, Eric J. R.; Andrade, José S.; Herrmann, Hans J.
2008-01-01
Saltation, the motion of sand grains in a sequence of ballistic trajectories close to the ground, is a major factor for surface erosion, dune formation, and triggering of dust storms on Mars. Although this mode of sand transport has been a matter of research for decades through both simulations and wind tunnel experiments under Earth and Mars conditions, it has not been possible to provide accurate measurements of particle trajectories in fully developed turbulent flow. Here we calculate the motion of saltating grains by directly solving the turbulent wind field and its interaction with the particles. Our calculations show that the minimal wind velocity required to sustain saltation on Mars may be surprisingly lower than the aerodynamic minimal threshold measurable in wind tunnels. Indeed, Mars grains saltate in 100 times higher and longer trajectories and reach 5-10 times higher velocities than Earth grains do. On the basis of our results, we arrive at general expressions that can be applied to calculate the length and height of saltation trajectories and the flux of grains in saltation under various physical conditions, when the wind velocity is close to the minimal threshold for saltation. PMID:18443302
Initial Ares I Bending Filter Design
NASA Technical Reports Server (NTRS)
Jang, Jiann-Woei; Bedrossian, Nazareth; Hall, Robert; Norris, H. Lee; Hall, Charles; Jackson, Mark
2007-01-01
The Ares-I launch vehicle represents a challenging flex-body structural environment for control system design. Software filtering of the inertial sensor output will be required to ensure control system stability and adequate performance. This paper presents a design methodology employing numerical optimization to develop the Ares-I bending filters. The filter design methodology was based on a numerical constrained optimization approach to maximize stability margins while meeting performance requirements. The resulting bending filter designs achieved stability by adding lag to the first structural frequency and hence phase stabilizing the first Ares-I flex mode. To minimize rigid body performance impacts, a priority was placed via constraints in the optimization algorithm to minimize bandwidth decrease with the addition of the bending filters. The bending filters provided here have been demonstrated to provide a stable first stage control system in both the frequency domain and the MSFC MAVERIC time domain simulation.
A Concept for a Mobile Remote Manipulator System
NASA Technical Reports Server (NTRS)
Mikulus, M. M., Jr.; Bush, H. G.; Wallsom, R. E.; Jensen, J. K.
1985-01-01
A conceptual design for a Mobile Remote Manipulator System (MRMS) is presented. This concept does not require continuous rails for mobility (only guide pins at truss hardpoints) and is very compact, being only one bay square. The MRMS proposed is highly maneuverable and is able to move in any direction along the orthogonal guide pin array under complete control at all times. The proposed concept would greatly enhance the safety and operational capabilities of astronauts performing EVA functions such as structural assembly, payload transport and attachment, space station maintenance, repair or modification, and future spacecraft construction or servicing. The MRMS drive system conceptual design presented is a reasonably simple mechanical device which can be designed to exhibit high reliability. Developmentally, all components of the proposed MRMS either exist or are considered to be completely state of the art designs requiring minimal development, features which should enhance reliability and minimize costs.
Assessment of pre-gastroscopy fasting period using ultrasonography.
Spahn, Thomas Werner; Wessels, Anne; Grosse-Thie, Wolfram; Mueller, Michael Karl
2009-03-01
Discomfort is frequent in patients undergoing esophagogastroduodenoscopy who are routinely recommended to abstain at least for 6 h from liquid or solid food prior to the procedure. We investigated the minimal period of time required for the stomach to clear fluids in order to define a safe minimal pre-endoscopy fasting period. Gastric emptying was sonographically assessed in 54 patients by measurement of the antrum surface area prior to, immediately after, and 30, 60, and 90 min after ingestion of 300 ml water and water containing 75 g glucose or apple juice. Esophagogastroduodenoscopy was performed subsequently. Ingestion of water required 1 h for complete clearance. Three hundred milliliters glucose solution and apple juice were cleared more slowly, 90 min after drinking. Ingestion of water or glucose solution prior to esophagogastroduodenoscopy in patients without a history of gastric emptying dysfunction is safe when observing a 90 min latency period and might prevent discomfort.
Minimal conditions for the existence of a Hawking-like flux
DOE Office of Scientific and Technical Information (OSTI.GOV)
Barcelo, Carlos; Liberati, Stefano; Sonego, Sebastiano
2011-02-15
We investigate the minimal conditions that an asymptotically flat general relativistic spacetime must satisfy in order for a Hawking-like Planckian flux of particles to arrive at future null infinity. We demonstrate that there is no requirement that any sort of horizon form anywhere in the spacetime. We find that the irreducible core requirement is encoded in an approximately exponential 'peeling' relationship between affine coordinates on past and future null infinity. As long as a suitable adiabaticity condition holds, then a Planck-distributed Hawking-like flux will arrive at future null infinity with temperature determined by the e-folding properties of the outgoing null geodesics. The temperature of the Hawking-like flux can slowly evolve as a function of time. We also show that the notion of peeling of null geodesics is distinct from the usual notion of 'inaffinity' used in Hawking's definition of surface gravity.
A decomposition approach to the design of a multiferroic memory bit
NASA Astrophysics Data System (ADS)
Acevedo, Ruben; Liang, Cheng-Yen; Carman, Gregory P.; Sepulveda, Abdon E.
2017-06-01
The objective of this paper is to present a methodology for the design of a memory bit to minimize the energy required to write data at the bit level. By straining a ferromagnetic nickel nano-dot by means of a piezoelectric substrate, its magnetization vector rotates between two stable states defined as a 1 and 0 for digital memory. The memory bit geometry, actuation mechanism and voltage control law were used as design variables. The approach used was to decompose the overall design process into simpler sub-problems whose structure can be exploited for a more efficient solution. This method minimizes the number of fully dynamic coupled finite element analyses required to converge to a near optimal design, thus decreasing the computational time for the design process. An in-plane sample design problem is presented to illustrate the advantages and flexibility of the procedure.
THREAT ENSEMBLE VULNERABILITY ASSESSMENT ...
Software and manual. TEVA-SPOT is used by water utilities to optimize the number and location of contamination detection sensors so that economic and/or public health consequences are minimized. TEVA-SPOT is interactive, allowing a user to specify the minimization objective (e.g., the number of people exposed, the time to detection, or the extent of pipe length contaminated). It also allows a user to specify constraints. For example, a TEVA-SPOT user can employ expert knowledge during the design process by identifying either existing or unfeasible sensor locations. Installation and maintenance costs for sensor placement can also be factored into the analysis. Python and Java are required to run TEVA-SPOT.
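As a toy illustration of this kind of sensor-placement optimization (not the TEVA-SPOT algorithm itself), the Python sketch below greedily places a fixed number of sensors to minimize the mean consequence over a handful of contamination scenarios; the impact table, node names, and sensor budget are made-up assumptions.

```python
# Minimal sketch: greedy sensor placement that minimizes the mean impact over
# contamination scenarios. impact[s][n] = consequence (e.g., people exposed)
# if scenario s is first detected at node n; the sentinel means "not detected".
UNDETECTED = 1e9
impact = {
    "spill_A": {"n1": 120, "n2": 300, "n3": UNDETECTED},
    "spill_B": {"n1": UNDETECTED, "n2": 80, "n3": 50},
    "spill_C": {"n1": 40, "n2": 200, "n3": 90},
}
candidates = {"n1", "n2", "n3"}
forbidden = set()          # e.g., infeasible installation sites (expert knowledge)
budget = 2                 # number of sensors to place

def mean_impact(sensors):
    total = 0.0
    for sc in impact.values():
        total += min(sc[n] for n in sensors) if sensors else UNDETECTED
    return total / len(impact)

placed = set()
while len(placed) < budget:
    # add the candidate that lowers the mean impact the most
    best = min(candidates - forbidden - placed,
               key=lambda n: mean_impact(placed | {n}))
    placed.add(best)

print("sensors:", placed, "mean impact:", mean_impact(placed))
```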
Self-Alining Quick-Connect Joint
NASA Technical Reports Server (NTRS)
Lucy, M. H.
1983-01-01
Quick connect tapered joint used with minimum manipulation and force. Split ring retainer holds locking ring in place. Minimal force required to position male in female joint, at which time split-ring retainers are triggered to release split locking rings. Originally developed to assemble large space structures, joint is simple, compact, strong, lightweight, self alining, and has no loose parts.
AGARD Flight Test Techniques Series. Volume 7. Air-to-Air Radar Flight Testing
1988-06-01
enters the beam), a different tilt angle should be used. The emphasis on setting the tilt angle may require a non-standard high accuracy tilt angle...is: the time from pilot designation on a non-maneuvering target to the time that the system achieves target range, range rate and angle tracking...minimal attenuation, distortion, or boresight shift effects on the radar beam. Thus, radome design for airborne application is largely a process of
An iteration algorithm for optimal network flows
NASA Astrophysics Data System (ADS)
Woong, C. J.
1983-09-01
A packet switching network has the desirable feature of rapidly handling short (bursty) messages of the type often found in computer communication systems. In evaluating packet switching networks, the average time delay per packet is one of the most important measures of performance. The problem of message routing to minimize time delay is analyzed here using two approaches, called "successive saturation' and "max-slack', for various traffic requirement matrices and networks with fixed topology and link capacities.
NASA Astrophysics Data System (ADS)
Rizvi, Syed S.; Shah, Dipali; Riasat, Aasia
The Time Warp algorithm [3] offers a run time recovery mechanism that deals with the causality errors. These run time recovery mechanisms consist of rollback, anti-message, and Global Virtual Time (GVT) techniques. For rollback, there is a need to compute GVT, which is used in discrete-event simulation to reclaim the memory, commit the output, detect the termination, and handle the errors. However, the computation of GVT requires dealing with the transient message problem and the simultaneous reporting problem. These problems can be dealt with in an efficient manner by Samadi's algorithm [8], which works fine in the presence of causality errors. However, the performance of both the Time Warp and Samadi's algorithms depends on the latency involved in GVT computation. Both algorithms give poor latency for large simulation systems, especially in the presence of causality errors. To improve the latency and reduce the processor idle time, we implement tree and butterfly barriers with the optimistic algorithm. Our analysis shows that the use of synchronous barriers such as tree and butterfly with the optimistic algorithm not only minimizes the GVT latency but also minimizes the processor idle time.
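The sketch below is only a schematic of the barrier-based GVT idea, using Python's built-in threading.Barrier as a stand-in for the tree/butterfly barriers discussed in the abstract; the local virtual times and the number of logical processes are invented for illustration.

```python
# Illustrative sketch: logical processes (LPs) meet at a synchronous barrier,
# and the barrier action computes a GVT estimate as the minimum of all local
# virtual times. threading.Barrier stands in for a tree/butterfly barrier.
import threading

local_virtual_time = [42.0, 17.5, 33.0, 28.1, 55.9, 21.4, 60.2, 19.8]
n_lp = len(local_virtual_time)
gvt = [None]

def compute_gvt():
    # runs exactly once per barrier cycle, after all LPs have reported
    gvt[0] = min(local_virtual_time)

barrier = threading.Barrier(n_lp, action=compute_gvt)

def logical_process(rank):
    # ... optimistic event processing would update local_virtual_time[rank] here ...
    barrier.wait()            # synchronous GVT computation point
    # after the barrier, every LP may fossil-collect state below gvt[0]

threads = [threading.Thread(target=logical_process, args=(r,)) for r in range(n_lp)]
for t in threads:
    t.start()
for t in threads:
    t.join()
print("GVT estimate:", gvt[0])
```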
A Kernel-based Lagrangian method for imperfectly-mixed chemical reactions
NASA Astrophysics Data System (ADS)
Schmidt, Michael J.; Pankavich, Stephen; Benson, David A.
2017-05-01
Current Lagrangian (particle-tracking) algorithms used to simulate diffusion-reaction equations must employ a certain number of particles to properly emulate the system dynamics, particularly for imperfectly-mixed systems. The number of particles is tied to the statistics of the initial concentration fields of the system at hand. Systems with shorter-range correlation and/or smaller concentration variance require more particles, potentially limiting the computational feasibility of the method. For the well-known problem of bimolecular reaction, we show that using kernel-based, rather than Dirac delta, particles can significantly reduce the required number of particles. We derive the fixed width of a Gaussian kernel for a given reduced number of particles that analytically eliminates the error between kernel and Dirac solutions at any specified time. We also show how to solve for the fixed kernel size by minimizing the squared differences between solutions over any given time interval. Numerical results show that the width of the kernel should be kept below about 12% of the domain size, and that the analytic equations used to derive kernel width suffer significantly from the neglect of higher-order moments. The simulations with a kernel width given by least squares minimization perform better than those made to match at one specific time. A heuristic time-variable kernel size, based on the previous results, performs on par with the least squares fixed kernel size.
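A hedged numerical sketch of the least-squares kernel-width fit is given below: it matches a kernel-smoothed representation of a reduced particle ensemble to an analytic 1-D diffusion profile over a time window. The diffusivity, domain size, particle count, and the 12%-of-domain upper bound on the width are assumptions chosen only to mirror the abstract's statements, not the paper's actual setup.

```python
# Hedged sketch: choose a fixed Gaussian kernel width h for a reduced number
# of particles by least-squares matching an analytic diffusion profile over
# several times. Domain, diffusivity, and particle count are assumptions.
import numpy as np
from scipy.optimize import minimize_scalar
from scipy.stats import norm

rng = np.random.default_rng(0)
D, L, n_particles = 1e-3, 1.0, 200         # diffusivity, domain length, particles
x = np.linspace(0.0, L, 400)
dx = x[1] - x[0]
times = np.linspace(0.1, 1.0, 5)
z = rng.standard_normal(n_particles)       # fixed random draws keep the fit deterministic

def analytic(t):
    # point release at the domain centre spreading by pure diffusion
    return norm.pdf(x, loc=L / 2, scale=np.sqrt(2 * D * t))

def kernel_solution(t, h):
    pos = L / 2 + np.sqrt(2 * D * t) * z   # particle positions at time t
    return norm.pdf(x[:, None], loc=pos, scale=h).mean(axis=1)

def integrated_squared_error(h):
    return sum(((kernel_solution(t, h) - analytic(t)) ** 2).sum() * dx for t in times)

fit = minimize_scalar(integrated_squared_error, bounds=(1e-3, 0.12 * L), method="bounded")
print("least-squares kernel width h =", fit.x)
```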
NASA Astrophysics Data System (ADS)
Tashakkori, H.; Rajabifard, A.; Kalantari, M.
2016-10-01
Search and rescue procedures for indoor environments are quite complicated due to the fact that much of the indoor information is unavailable to rescuers before physical entrance to the incident scene. Thus, decisions about the number of crew required and how they should be dispatched in the building, considering the various access points and building complexities so that the search area is covered in minimum time, depend on the prior knowledge and experience of the emergency commanders. Hence, this paper introduces the Search and Rescue Problem (SRP), which aims at finding the best search and rescue routes that minimize the overall search time in the buildings. 3D BIM-oriented indoor GIS is integrated in the indoor route graph to find accurate routes based on the building geometric and semantic information. An ant colony-based algorithm is presented that finds the number of first responders required and their individual routes to search all rooms and points of interest inside the building, so as to minimize the overall time spent by all rescuers inside the disaster area. The evaluation of the proposed model for a case study building shows a significant improvement in search and rescue time, which will lead to a higher chance of saving lives and less exposure of emergency crew to danger.
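A compact ant-colony sketch in the spirit of the abstract is shown below for a single rescuer who must visit every room from an entrance in minimum travel time; the room graph, travel times, and pheromone parameters are all invented, and the real SRP additionally splits rooms among several responders.

```python
# Illustrative ant-colony sketch (not the paper's SRP formulation): ants build
# room-visiting tours biased by pheromone and inverse travel time, and the
# best tour found is kept. Rooms and travel times are made-up assumptions.
import random

rooms = ["entry", "r1", "r2", "r3", "r4"]
t = {("entry", "r1"): 5, ("entry", "r2"): 9, ("entry", "r3"): 4, ("entry", "r4"): 7,
     ("r1", "r2"): 3, ("r1", "r3"): 6, ("r1", "r4"): 8,
     ("r2", "r3"): 5, ("r2", "r4"): 2, ("r3", "r4"): 6}

def travel(a, b):
    return t.get((a, b)) or t[(b, a)]

pher = {frozenset(e): 1.0 for e in t}       # pheromone per edge
alpha, beta, rho, n_ants, n_iter = 1.0, 2.0, 0.3, 20, 100
best_tour, best_time = None, float("inf")

for _ in range(n_iter):
    tours = []
    for _ in range(n_ants):
        tour, current = ["entry"], "entry"
        unvisited = set(rooms) - {"entry"}
        while unvisited:
            options = list(unvisited)
            weights = [pher[frozenset((current, r))] ** alpha *
                       (1.0 / travel(current, r)) ** beta for r in options]
            current = random.choices(options, weights)[0]
            tour.append(current)
            unvisited.discard(current)
        cost = sum(travel(a, b) for a, b in zip(tour, tour[1:]))
        tours.append((cost, tour))
        if cost < best_time:
            best_time, best_tour = cost, tour
    for e in pher:                           # evaporation
        pher[e] *= (1.0 - rho)
    for cost, tour in tours:                 # pheromone deposit, shorter tours deposit more
        for a, b in zip(tour, tour[1:]):
            pher[frozenset((a, b))] += 1.0 / cost

print("best tour:", best_tour, "time:", best_time)
```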
Learning curve with minimally invasive unicompartmental knee arthroplasty.
Hamilton, William G; Ammeen, Deborah; Engh, C Anderson; Engh, Gerard A
2010-08-01
This study examined 445 consecutive minimally invasive unicompartmental knee arthroplasties (UKAs) from one institution to determine whether revision and reoperation rates would decrease as the number of cases performed increased, indicating the presence of a learning curve with this procedure. At a mean of 3.25 years, 26 knees required revision yielding an overall revision rate of 5.8%; survivorship at 2 years with revision as an end point was 96% +/- 1.7%. Both revisions and reoperations decreased over time but not significantly. For the first half of UKA cases performed vs the second half, revision rates fell from 5.0% to 2.5%, and reoperation rates fell from 8.1% to 5.4%. These data demonstrate that despite modifications made to improve surgical technique across time, a substantial complication rate with this procedure persists. Copyright (c) 2010 Elsevier Inc. All rights reserved.
Booth, David T; Evans, Andrew
2011-01-01
For sea turtles nesting on beaches surrounded by coral reefs, the most important element of hatchling recruitment is escaping predation by fish as they swim across the fringing reef, and as a consequence hatchlings that minimize their exposure to fish predation by minimizing the time spent crossing the fringing reef have a greater chance of surviving the reef crossing. One way to decrease the time required to cross the fringing reef is to maximize swimming speed. We found that both water temperature and nest temperature influence swimming performance of hatchling green turtles, but in opposite directions. Warm water increases swimming ability, with hatchling turtles swimming in warm water having a faster stroke rate, while an increase in nest temperature decreases swimming ability with hatchlings from warm nests producing less thrust per stroke.
Note: Four-port microfluidic flow-cell with instant sample switching
NASA Astrophysics Data System (ADS)
MacGriff, Christopher A.; Wang, Shaopeng; Tao, Nongjian
2013-10-01
A simple device for high-speed microfluidic delivery of liquid samples to a surface plasmon resonance sensor surface is presented. The delivery platform comprises a four-port microfluidic cell, in which two ports serve as inlets for buffer and sample solutions, respectively, and a high-speed selector valve that controls the alternate opening and closing of the two outlet ports. The time scale of buffer/sample switching (or sample injection rise and fall time) is on the order of milliseconds, thereby minimizing the opportunity for sample plug dispersion. The high rates of mass transport to and from the central microfluidic sensing region allow for SPR-based kinetic analysis of binding events with dissociation rate constants (kd) up to 130 s-1. The required sample volume is only 1 μL, allowing for minimal sample consumption during high-speed kinetic binding measurement.
Rapid Pneumatic Transport of Radioactive Samples - RaPToRS
NASA Astrophysics Data System (ADS)
Padalino, S.; Barrios, M.; Sangster, C.
2005-10-01
Some ICF neutron activation diagnostics require quick retrieval of the activated sample. Minimizing retrieval times is particularly important when the half-life of the activated material is on the order of the transport time or the degree of radioactivity is close to the background counting level. These restrictions exist in current experiments performed at the Laboratory for Laser Energetics, thus motivating the development of the RaPToRS system. The system has been designed to minimize transportation time while requiring no human intervention during transport or counting. These factors will be important if the system is to be used at the NIF, where radiological hazards will be present during post-activation. The sample carrier is pneumatically transported via a 4 inch ID PVC pipe to a remote location in excess of 100 meters from the activation site at a speed of approximately 7 m/s. It arrives at an end station where it is dismounted robotically from the carrier and removed from its hermetic package. The sample is then placed by the robot in a counting station. This system is currently being developed to measure back-to-back gamma rays produced by the annihilation of positrons emitted by activated graphite. Funded in part by the U.S. DOE under subcontract with LLE at the University of Rochester.
Systems Biology Perspectives on Minimal and Simpler Cells
Xavier, Joana C.; Patil, Kiran Raosaheb
2014-01-01
SUMMARY The concept of the minimal cell has fascinated scientists for a long time, from both fundamental and applied points of view. This broad concept encompasses extreme reductions of genomes, the last universal common ancestor (LUCA), the creation of semiartificial cells, and the design of protocells and chassis cells. Here we review these different areas of research and identify common and complementary aspects of each one. We focus on systems biology, a discipline that is greatly facilitating the classical top-down and bottom-up approaches toward minimal cells. In addition, we also review the so-called middle-out approach and its contributions to the field with mathematical and computational models. Owing to the advances in genomics technologies, much of the work in this area has been centered on minimal genomes, or rather minimal gene sets, required to sustain life. Nevertheless, a fundamental expansion has been taking place in the last few years wherein the minimal gene set is viewed as a backbone of a more complex system. Complementing genomics, progress is being made in understanding the system-wide properties at the levels of the transcriptome, proteome, and metabolome. Network modeling approaches are enabling the integration of these different omics data sets toward an understanding of the complex molecular pathways connecting genotype to phenotype. We review key concepts central to the mapping and modeling of this complexity, which is at the heart of research on minimal cells. Finally, we discuss the distinction between minimizing the number of cellular components and minimizing cellular complexity, toward an improved understanding and utilization of minimal and simpler cells. PMID:25184563
Mazzaglia, Giampiero; Straus, Sabine M J; Arlett, Peter; da Silva, Daniela; Janssen, Heidi; Raine, June; Alteri, Enrica
2018-02-01
Studies measuring the effectiveness of risk minimization measures (RMMs) submitted by pharmaceutical companies to the European Medicines Agency are part of the post-authorization regulatory requirements and represent an important source of data covering a range of medicinal products and safety-related issues. Their objectives, design, and the associated regulatory outcomes were reviewed, and conclusions were drawn that may support future progress in risk minimization evaluation. Information was obtained from risk management plans, study protocols, clinical study reports, and assessment reports of 157 medicinal products authorized for cardiovascular, endocrinology, and metabolic indications. We selected observational studies measuring, as outcomes of interest, the relationship between the RMMs in place and (1) implementation measures, such as clinical knowledge or physicians' compliance to recommendations contained in the RMMs; and (2) occurrence or reduced severity of the adverse drug reactions for which the RMMs were required. Of 59 eligible studies (24 completed, 35 ongoing), 44 assessed implementation measures, whereas only 15 assessed safety outcomes (1 study as a single endpoint and 14 studies with other endpoints). Fifty-one studies used non-experimental designs and 25 studies employed electronic healthcare databases for analysis. Of the 24 completed studies, 17 were considered satisfactory and supported immediate regulatory decision making, 6 were considered inconclusive and required new evaluations, and 1 was terminated early because new safety restrictions were required, thereby necessitating a new evaluation. Compliance with agreed deadlines was considered acceptable in 21 of 24 completed studies; the average time for a submission was 37 months (standard deviation ± 17), with differences observed by type of data source employed. Three important gaps in the evaluation plans of RMMs were identified: lack of early feedback on implementation, limited evaluation of safety outcomes, and inability to provide information on the effectiveness from an integrated measurement of different elements of a set of risk minimization tools. More robust evidence is needed to advance regulatory science and support more rapid adjustment of risk minimization strategies as needed.
Results of completion arteriography after minimally invasive off-pump coronary artery bypass.
Hoff, Steven J; Ball, Stephen K; Leacche, Marzia; Solenkova, Natalia; Umakanthan, Ramanan; Petracek, Michael R; Ahmad, Rashid; Greelish, James P; Walker, Kristie; Byrne, John G
2011-01-01
The benefits of a minimally invasive approach to off-pump coronary artery bypass remain controversial. The value of completion arteriography in validating this technique has not been investigated. From April 2007 to October 2009, fifty-six patients underwent isolated minimally invasive coronary artery bypass grafting through a left thoracotomy without cardiopulmonary bypass. Forty-three of these patients underwent completion arteriography. Sixty-five grafts were performed in these 56 patients (average, 1.2 grafts per patient; range, 1 to 3). Forty-eight grafts were studied in the 43 patients undergoing completion arteriography. There were 4 findings on arteriogram leading to further immediate intervention (8.3%). These included 3 grafts with anastomotic stenoses or spasm requiring stent placement, and 1 patient who had limited dissection in the left internal mammary artery graft and underwent placement of an additional vein graft. These findings were independent of electrocardiographic changes or hemodynamic instability. The remainder of the studies showed no significant abnormalities. There were no deaths. One patient who did not have a completion arteriogram suffered a postoperative myocardial infarction requiring stent placement for anastomotic stenosis. Patients were discharged home an average of 6.8 days postoperatively. There were no instances of renal dysfunction postoperatively attributable to catheterization. Minimally invasive coronary artery bypass is safe and effective. Findings of completion arteriography occasionally reveal previously under-recognized findings that, if corrected in a timely fashion, could potentially impact graft patency and clinical outcomes. Our experience validates this minimally invasive technique. Copyright © 2011 The Society of Thoracic Surgeons. Published by Elsevier Inc. All rights reserved.
Nakata, Bruce Negrello; Cavalini, Worens; Bonin, Eduardo A; Salvalaggio, Paolo R; Loureiro, Marcelo P
2017-10-01
Minimally invasive surgery (MIS) requires the mastery of manual skills, and specific training is therefore needed. Apart from residencies and fellowships in MIS, other learning opportunities rely on massed training, mainly with the use of simulators in short courses. A long-term postgraduate course represents an opportunity to learn through training using distributed practice. The objective of this study is to assess the use of distributed practice for acquisition of basic minimally invasive skills in surgeons who participated in a long-term MIS postgraduate course. A prospective, longitudinal and quantitative study was conducted among surgeons who attended a 1-year postgraduate course of MIS in Brazil, from 2012 to 2014. They were tested through five different exercises in box trainers (peg-transfer, passing, cutting, intracorporeal knot, and suture) in the first (t0), fourth (t1), and last (eighth, t2) meetings of this course. The time and penalties of each exercise were collected for each participant. Participant skills were assessed based on time and accuracy on a previously tested score. Fifty-seven surgeons (participants) from three consecutive groups participated in this study. There was a significant improvement in scores in all exercises. The average increase in scores between t0 and t2 was 88% for peg-transfer, 174% for passing, 149% for cutting, 130% for intracorporeal knot, and 120% for suture (p < 0.001 for all exercises). Learning through distributed practice is effective and should be integrated into an MIS postgraduate course curriculum for acquisition of core skills.
Probabilistic sparse matching for robust 3D/3D fusion in minimally invasive surgery.
Neumann, Dominik; Grbic, Sasa; John, Matthias; Navab, Nassir; Hornegger, Joachim; Ionasec, Razvan
2015-01-01
Classical surgery is being overtaken by minimally invasive and transcatheter procedures. As there is no direct view or access to the affected anatomy, advanced imaging techniques such as 3D C-arm computed tomography (CT) and C-arm fluoroscopy are routinely used in clinical practice for intraoperative guidance. However, due to constraints regarding acquisition time and device configuration, intraoperative modalities have limited soft tissue image quality and reliable assessment of the cardiac anatomy typically requires contrast agent, which is harmful to the patient and requires complex acquisition protocols. We propose a probabilistic sparse matching approach to fuse high-quality preoperative CT images and nongated, noncontrast intraoperative C-arm CT images by utilizing robust machine learning and numerical optimization techniques. Thus, high-quality patient-specific models can be extracted from the preoperative CT and mapped to the intraoperative imaging environment to guide minimally invasive procedures. Extensive quantitative experiments on 95 clinical datasets demonstrate that our model-based fusion approach has an average execution time of 1.56 s, while the accuracy of 5.48 mm between the anchor anatomy in both images lies within expert user confidence intervals. In direct comparison with image-to-image registration based on an open-source state-of-the-art medical imaging library and a recently proposed quasi-global, knowledge-driven multi-modal fusion approach for thoracic-abdominal images, our model-based method exhibits superior performance in terms of registration accuracy and robustness with respect to both target anatomy and anchor anatomy alignment errors.
The ETA-2 induction linac as a high average power FEL driver
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nexsen, W.E.; Atkinson, D.P.; Barrett, D.M.
1989-10-16
The Experimental Test Accelerator-II (ETA-II) is the first induction linac designed specifically to FEL requirements. It is primarily intended to demonstrate induction accelerator technology for high average power, high brightness electron beams, and will be used to drive a 140 and 250 GHz microwave FEL for plasma heating experiments in the Microwave Tokamak Experiment (MTX) at LLNL. Its features include a high vacuum design which allows the use of an intrinsically bright dispenser cathode, induction cells designed to minimize BBU growth rate, and careful attention to magnetic alignment to minimize radial sweep due to beam corkscrew. The use of magnetic switches allows high average power operation. At present ETA-II is being used to drive 140 GHz plasma heating experiments. These experiments require nominal beam parameters of 6 MeV energy, 2 kA current, 20 ns pulse width and a brightness of 1 × 10^8 A/(m-rad)^2 at the wiggler with a pulse repetition frequency (PRF) of 0.5 Hz. Future 250 GHz experiments require beam parameters of 10 MeV energy, 3 kA current, 50 ns pulse width and a brightness of 1 × 10^8 A/(m-rad)^2 with a 5 kHz PRF for 0.5 sec. In this paper we discuss the present status of ETA-II parameters and the phased development program necessary to satisfy these future requirements. 13 refs., 9 figs., 1 tab.
ECO fill: automated fill modification to support late-stage design changes
NASA Astrophysics Data System (ADS)
Davis, Greg; Wilson, Jeff; Yu, J. J.; Chiu, Anderson; Chuang, Yao-Jen; Yang, Ricky
2014-03-01
One of the most critical factors in achieving a positive return for a design is ensuring the design not only meets performance specifications, but also produces sufficient yield to meet the market demand. The goal of design for manufacturability (DFM) technology is to enable designers to address manufacturing requirements during the design process. While new cell-based, DP-aware, and net-aware fill technologies have emerged to provide the designer with automated fill engines that support these new fill requirements, design changes that arrive late in the tapeout process (as engineering change orders, or ECOs) can have a disproportionate effect on tapeout schedules, due to the complexity of replacing fill. If not handled effectively, the impacts on file size, run time, and timing closure can significantly extend the tapeout process. In this paper, the authors examine changes to design flow methodology, supported by new fill technology, that enable efficient, fast, and accurate adjustments to metal fill late in the design process. We present an ECO fill methodology coupled with the support of advanced fill tools that can quickly locate the portion of the design affected by the change, remove and replace only the fill in that area, while maintaining the fill hierarchy. This new fill approach effectively reduces run time, contains fill file size, minimizes timing impact, and minimizes mask costs due to ECO-driven fill changes, all of which are critical factors to ensuring time-to-market schedules are maintained.
An efficient graph theory based method to identify every minimal reaction set in a metabolic network
2014-01-01
Background Development of cells with minimal metabolic functionality is gaining importance due to their efficiency in producing chemicals and fuels. Existing computational methods to identify minimal reaction sets in metabolic networks are computationally expensive. Further, they identify only one of the several possible minimal reaction sets. Results In this paper, we propose an efficient graph theory based recursive optimization approach to identify all minimal reaction sets. Graph theoretical insights offer systematic methods to not only reduce the number of variables in math programming and increase its computational efficiency, but also provide efficient ways to find multiple optimal solutions. The efficacy of the proposed approach is demonstrated using case studies from Escherichia coli and Saccharomyces cerevisiae. In case study 1, the proposed method identified three minimal reaction sets, each containing 38 reactions, in the Escherichia coli central metabolic network with 77 reactions. Analysis of these three minimal reaction sets revealed that one of them is more suitable for developing a minimal metabolism cell compared to the other two due to a practically achievable internal flux distribution. In case study 2, the proposed method identified 256 minimal reaction sets from the Saccharomyces cerevisiae genome scale metabolic network with 620 reactions. The proposed method required only 4.5 hours to identify all 256 minimal reaction sets and showed a significant reduction (approximately 80%) in the solution time when compared to the existing methods for finding minimal reaction sets. Conclusions Identification of all minimal reaction sets in metabolic networks is essential since different minimal reaction sets have different properties that affect bioprocess development. The proposed method correctly identified all minimal reaction sets in both case studies. The proposed method is computationally efficient compared to other methods for finding minimal reaction sets and is useful to employ with genome-scale metabolic networks. PMID:24594118
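The toy Python sketch below conveys the flavour of enumerating every minimal reaction set, using simple metabolite reachability on a made-up five-reaction network instead of the paper's flux-balance and graph-theoretic machinery; reaction names and the network are purely illustrative.

```python
# Toy sketch (reachability instead of full flux balance): enumerate every
# minimal reaction subset that still lets the network convert the substrate
# into the target product. Enumerating by increasing subset size and skipping
# supersets of already-found sets yields exactly the minimal sets.
from itertools import combinations

reactions = {                 # reaction -> (consumed metabolites, produced metabolites)
    "uptake":     ({"glc_ext"}, {"glc"}),
    "glycolysis": ({"glc"}, {"pyr"}),
    "bypass_a":   ({"glc"}, {"x"}),
    "bypass_b":   ({"x"}, {"pyr"}),
    "secrete":    ({"pyr"}, {"product"}),
}
seed, target = {"glc_ext"}, "product"

def producible(active):
    mets = set(seed)
    changed = True
    while changed:
        changed = False
        for r in active:
            need, make = reactions[r]
            if need <= mets and not make <= mets:
                mets |= make
                changed = True
    return target in mets

minimal_sets = []
for k in range(1, len(reactions) + 1):
    for subset in combinations(reactions, k):
        s = set(subset)
        if producible(s) and not any(m <= s for m in minimal_sets):
            minimal_sets.append(s)

print("all minimal reaction sets:", minimal_sets)
```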
Mathers, Bradley; Moyer, Matthew; Mathew, Abraham; Dye, Charles; Levenick, John; Gusani, Niraj; Dougherty-Hamod, Brandy; McGarrity, Thomas
2016-01-01
Direct percutaneous endoscopic necrosectomy has been described as a minimally invasive intervention for the debridement of walled-off pancreatic necrosis (WOPN). In this retrospective cohort study, we aimed to confirm these findings in a US referral center and evaluate the clinical value of this modality in the treatment of pancreatic necrosis as well as other types of intra-abdominal fluid collections and necrosis. Twelve consecutive patients with WOPN or other abdominal abscess requiring debridement and washout underwent computed tomography (CT)-guided drainage catheter placement. Each patient then underwent direct percutaneous endoscopic necrosectomy and washout with repeat debridement performed until complete. Drains were then removed once output fell below 30 mL/day and imaging confirmed resolution. The primary endpoints were time to clinical resolution and sustained resolution at 1-year follow-up. Ten patients were treated for WOPN, one for necrotic hepatic abscesses, and one for omental necrosis. The median time to intervention was 85 days with an average of 2.3 necrosectomies performed. Complete removal of drains was accomplished in 11 patients (92%). The median time to resolution was 57 days. No serious adverse events occurred; however, one patient developed pancreaticocutaneous fistulas. Ten patients completed 1-year surveillance, of whom none required drain replacement. No patients required surgery or repeat endoscopy. This series supports the premise that direct percutaneous endoscopic necrosectomy is a safe and effective intervention for intra-abdominal fluid collections and necrosis in appropriately selected patients. Our study demonstrates a high clinical success rate with minimal adverse events. This modality offers several potential advantages over surgical and transgastric approaches, including improved accessibility, an excellent safety profile, and the need for only moderate or deep sedation.
Liu, Jie; Guo, Liang; Jiang, Jiping; Jiang, Dexun; Wang, Peng
2018-04-13
To minimize the damage caused by river chemical spills, efficient emergency material allocation is critical for rapid, real-world emergency rescue decision-making. In this study, an emergency material allocation framework based on a time-varying supply-demand constraint is developed to allocate emergency material, minimize the emergency response time, and satisfy the dynamic emergency material requirements in post-accident phases dealing with river chemical spills. First, the theoretically critical emergency response time is obtained for the emergency material allocation system to select a series of appropriate emergency material warehouses as potential supportive centers. Then, an enumeration method is applied to identify the practically critical emergency response time and the optimum emergency material allocation and replenishment scheme. Finally, the developed framework is applied to a computational experiment based on the south-to-north water transfer project in China. The results illustrate that the proposed methodology is a simple and flexible tool for appropriately allocating emergency material to satisfy time-dynamic demands during emergency decision-making. Therefore, the decision-makers can identify an appropriate emergency material allocation scheme that balances time-effectiveness and cost-effectiveness under different emergency pollution conditions.
Simulation Evaluation of Pilot Inputs for Real Time Modeling During Commercial Flight Operations
NASA Technical Reports Server (NTRS)
Martos, Borja; Ranaudo, Richard; Oltman, Ryan; Myhre, Nick
2017-01-01
Aircraft dynamics characteristics can only be identified from flight data when the aircraft dynamics are excited sufficiently. A preliminary study was conducted into what types and levels of manual piloted control excitation would be required for accurate Real-Time Parameter IDentification (RTPID) results by commercial airline pilots. This includes assessing the practicality for the pilot to provide this excitation when cued, and to further understand if pilot inputs during various phases of flight provide sufficient excitation naturally. An operationally representative task was evaluated by 5 commercial airline pilots using the NASA Ice Contamination Effects Flight Training Device (ICEFTD). Results showed that it is practical to use manual pilot inputs only as a means of achieving good RTPID in all phases of flight and in flight turbulence conditions. All pilots were effective in satisfying excitation requirements when cued. Much of the time, cueing was not even necessary, as just performing the required task provided enough excitation for accurate RTPID estimation. Pilot opinion surveys reported that the additional control inputs required when prompted by the excitation cueing were easy to make, quickly mastered, and required minimal training.
NASA Technical Reports Server (NTRS)
Kanning, G.; Cicolani, L. S.; Schmidt, S. F.
1983-01-01
Translational state estimation in terminal area operations, using a set of commonly available position, air data, and acceleration sensors, is described. Kalman filtering is applied to obtain maximum estimation accuracy from the sensors but feasibility in real-time computations requires a variety of approximations and devices aimed at minimizing the required computation time with only negligible loss of accuracy. Accuracy behavior throughout the terminal area, its relation to sensor accuracy, its effect on trajectory tracking errors and control activity in an automatic flight control system, and its adequacy in terms of existing criteria for various terminal area operations are examined. The principal investigative tool is a simulation of the system.
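To make the filtering idea concrete, the following sketch is a minimal 1-D Kalman filter, not the paper's full terminal-area estimator: it fuses a noisy position measurement with an acceleration input using the same predict/update structure the abstract refers to. The noise levels, time step, and constant-acceleration scenario are assumptions.

```python
# Minimal 1-D Kalman filter sketch: predict with an acceleration input,
# update with a noisy position measurement. All parameters are illustrative.
import numpy as np

dt = 0.1
F = np.array([[1.0, dt], [0.0, 1.0]])      # state transition for (position, velocity)
B = np.array([[0.5 * dt ** 2], [dt]])      # acceleration input matrix
H = np.array([[1.0, 0.0]])                 # only position is measured
Q = 1e-3 * np.eye(2)                       # process noise covariance (assumed)
R = np.array([[0.5]])                      # measurement noise covariance (assumed)

x = np.zeros((2, 1))                       # state estimate
P = np.eye(2)                              # estimate covariance

def kalman_step(x, P, accel, z):
    # predict using the acceleration (e.g., accelerometer) input
    x = F @ x + B * accel
    P = F @ P @ F.T + Q
    # update using the position (e.g., radio navigation) measurement
    y = z - H @ x
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    x = x + K @ y
    P = (np.eye(2) - K @ H) @ P
    return x, P

rng = np.random.default_rng(0)
for k in range(50):
    true_pos = 0.5 * 1.0 * (k * dt) ** 2   # truth under constant 1 m/s^2 acceleration
    z = np.array([[true_pos + rng.normal(0.0, 0.7)]])
    x, P = kalman_step(x, P, 1.0, z)

print("estimated position, velocity:", x.ravel())
```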
NASA Astrophysics Data System (ADS)
Steingroewer, Juliane; Bley, Thomas; Bergemann, Christian; Boschke, Elke
2007-04-01
Analyses of food-borne pathogens are of great importance in order to minimize the health risk for consumers. Thus, very sensitive and rapid detection methods are required. Current conventional culture techniques are very time-consuming. Modern immunoassays and biochemical analysis also require pre-enrichment steps, resulting in a turnaround time of at least 24 h. Biomagnetic separation (BMS) is a promising, more rapid method. In this study we describe the isolation of high-affinity and specific peptides from a phage-peptide library, which, combined with BMS, allows the detection of Salmonella spp. with a sensitivity similar to that of immunomagnetic separation using antibodies.
Cost-efficient scheduling of FAST observations
NASA Astrophysics Data System (ADS)
Luo, Qi; Zhao, Laiping; Yu, Ce; Xiao, Jian; Sun, Jizhou; Zhu, Ming; Zhong, Yi
2018-03-01
A cost-efficient schedule for the Five-hundred-meter Aperture Spherical radio Telescope (FAST) requires maximizing the number of observable proposals and the overall scientific priority, and minimizing the overall slew-cost generated by telescope shifting, while taking into account constraints including astronomical object visibility, user-defined observable times, and avoidance of Radio Frequency Interference (RFI). In this contribution, we first solve the problem of maximizing the number of observable proposals and the scientific priority by modeling it as a Minimum Cost Maximum Flow (MCMF) problem. The optimal schedule can be found by any MCMF solution algorithm. Then, for minimizing the slew-cost of the generated schedule, we devise a maximally-matchable edge detection-based method to reduce the problem size, and propose a backtracking algorithm to find the perfect matching with minimum slew-cost. Experiments on a real dataset from the NASA/IPAC Extragalactic Database (NED) show that the proposed scheduler can increase the usage of available times with high scientific priority and reduce the slew-cost significantly in a very short time.
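As an illustration of the MCMF formulation (the first stage only, before any slew-cost handling), the sketch below schedules toy proposals into time slots with networkx's max_flow_min_cost; proposal names, priorities, and slot visibilities are invented for the example.

```python
# Hedged sketch: schedule proposals into observable time slots by max-flow
# min-cost. Maximum flow maximizes the number of scheduled proposals; the
# negative-priority edge costs then favour high-priority proposals.
import networkx as nx

proposals = {"P1": {"priority": 5, "slots": ["t1", "t2"]},
             "P2": {"priority": 3, "slots": ["t2"]},
             "P3": {"priority": 4, "slots": ["t1", "t3"]}}
slots = ["t1", "t2", "t3"]

G = nx.DiGraph()
for p, info in proposals.items():
    G.add_edge("S", p, capacity=1, weight=0)
    for s in info["slots"]:                       # only slots where p is observable
        G.add_edge(p, s, capacity=1, weight=-info["priority"])
for s in slots:
    G.add_edge(s, "T", capacity=1, weight=0)

flow = nx.max_flow_min_cost(G, "S", "T")
schedule = {p: s for p in proposals
            for s in proposals[p]["slots"] if flow[p][s] == 1}
print("schedule:", schedule)
```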
Sloped terrain segmentation for autonomous drive using sparse 3D point cloud.
Cho, Seoungjae; Kim, Jonghyun; Ikram, Warda; Cho, Kyungeun; Jeong, Young-Sik; Um, Kyhyun; Sim, Sungdae
2014-01-01
A ubiquitous environment for road travel that uses wireless networks requires the minimization of data exchange between vehicles. An algorithm that can segment the ground in real time is necessary to obtain location data between vehicles simultaneously executing autonomous drive. This paper proposes a framework for segmenting the ground in real time using a sparse three-dimensional (3D) point cloud acquired from undulating terrain. A sparse 3D point cloud can be acquired by scanning the geography using light detection and ranging (LiDAR) sensors. For efficient ground segmentation, 3D point clouds are quantized in units of volume pixels (voxels) and overlapping data is eliminated. We reduce nonoverlapping voxels to two dimensions by implementing a lowermost heightmap. The ground area is determined on the basis of the number of voxels in each voxel group. We execute ground segmentation in real time by proposing an approach to minimize the comparison between neighboring voxels. Furthermore, we experimentally verify that ground segmentation can be executed at about 19.31 ms per frame.
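A much-simplified sketch of the two steps named in the abstract, voxel quantization and the lowermost heightmap (the voxel size, the random point array, and the height threshold are assumptions, not the authors' parameters or full segmentation logic):

```python
# Simplified sketch: quantize a sparse 3-D point cloud into 2-D voxel columns
# and collapse each column to its lowermost height. Not the authors' pipeline.
import numpy as np

def lowermost_heightmap(points, voxel=0.2):
    """points: (N, 3) array of x, y, z; returns {(ix, iy): lowest z}."""
    idx = np.floor(points[:, :2] / voxel).astype(int)   # 2-D voxel indices
    heightmap = {}
    for (ix, iy), z in zip(map(tuple, idx), points[:, 2]):
        key = (ix, iy)
        if key not in heightmap or z < heightmap[key]:
            heightmap[key] = z            # keep only the lowermost height
    return heightmap

pts = np.random.rand(1000, 3) * [10.0, 10.0, 2.0]        # fake LiDAR returns
hm = lowermost_heightmap(pts)
ground = {k for k, z in hm.items() if z < 0.3}            # crude height threshold
```

Collapsing each (x, y) voxel column to its lowest height is what lets the subsequent ground test run over a 2-D map rather than the full 3-D cloud.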
Freundt, Miriam; Ried, Michael; Philipp, Alois; Diez, Claudius; Kolat, Philipp; Hirt, Stephan W; Schmid, Christof; Haneya, Assad
2016-03-01
Advanced age is a known risk factor for morbidity and mortality after coronary artery bypass grafting (CABG). Minimized extracorporeal circulation (MECC) has been shown to reduce the negative effects associated with conventional extracorporeal circulation (CECC). This trial assesses the impact of MECC on the outcome of elderly patients undergoing CABG. Eight hundred and seventy-five patients (mean age 78.35 years) underwent isolated CABG using CECC (n=345) or MECC (n=530). The MECC group had a significantly shorter extracorporeal circulation time (ECCT), cross-clamp time and reperfusion time and lower transfusion needs. Postoperatively, these patients required significantly less inotropic support, fewer blood transfusions, less postoperative hemodialysis and developed less delirium compared to CECC patients. In the MECC group, intensive care unit (ICU) stay was significantly shorter and 30-day mortality was significantly reduced [2.6% versus 7.8%; p<0.001]. In conclusion, MECC improves outcome in elderly patients undergoing CABG surgery. © The Author(s) 2015.
2015-01-01
Complex RNA structures are constructed from helical segments connected by flexible loops that move spontaneously and in response to binding of small molecule ligands and proteins. Understanding the conformational variability of RNA requires the characterization of the coupled time evolution of interconnected flexible domains. To elucidate the collective molecular motions and explore the conformational landscape of the HIV-1 TAR RNA, we describe a new methodology that utilizes energy-minimized structures generated by the program “Fragment Assembly of RNA with Full-Atom Refinement (FARFAR)”. We apply structural filters in the form of experimental residual dipolar couplings (RDCs) to select a subset of discrete energy-minimized conformers and carry out principal component analyses (PCA) to corroborate the choice of the filtered subset. We use this subset of structures to calculate solution T1 and T1ρ relaxation times for 13C spins in multiple residues in different domains of the molecule using two simulation protocols that we previously published. We match the experimental T1 times to within 2% and the T1ρ times to within less than 10% for helical residues. These results introduce a protocol to construct viable dynamic trajectories for RNA molecules that accord well with experimental NMR data and support the notion that the motions of the helical portions of this small RNA can be described by a relatively small number of discrete conformations exchanging over time scales longer than 1 μs. PMID:24479561
1987-03-01
3/4 hours. Performance tests evaluated simple and choice reaction time to visual stimuli, vigilance, and processing of symbolic, numerical, verbal...minimize the adverse consequences of these stressors. Tyrosine enhanced performance (e.g. complex information processing, vigilance, and reaction time... processes inherent in many real-world tasks. For example, Map Compass requires association of direction and degree
Force-Free Time-Harmonic Plasmoids
1992-10-01
effect of currents or vortical motion are absolutely required for stability. What makes the present model attractive is the minimization of the body ...radiative-mode effects may be very fruitful in the future. For example: rigid non-radiative composite "particles" containing large numbers of fusable...
ERIC Educational Resources Information Center
Bardwell, John D.
This study sought to identify physical facilities needed to connect the six New England land-grant universities. Criteria were time (use of current technology), cost (regular operating budgets of participating institutions), minimal personnel requirements, flexibility, and compatibility. The telephone system, an existing microwave network, a…
Sink or Swim: Learning by Doing in a Supply Chain Integration Activity*
ERIC Educational Resources Information Center
Harnowo, Akhadian S.; Calhoun, Mikelle A.; Monteiro, Heather
2016-01-01
Studies show that supply chain integration (SCI) is important to organizations. This article describes an activity that places students in the middle of an SCI scenario. The highly interactive hands-on simulation requires only 50 to 60 minutes of classroom time, may be used with 18 to about 36 students, and involves minimal instructor preparation.…
Minimal-Approximation-Based Decentralized Backstepping Control of Interconnected Time-Delay Systems.
Choi, Yun Ho; Yoo, Sung Jin
2016-12-01
A decentralized adaptive backstepping control design using minimal function approximators is proposed for nonlinear large-scale systems with unknown unmatched time-varying delayed interactions and unknown backlash-like hysteresis nonlinearities. Compared with existing decentralized backstepping methods, the contribution of this paper is to design a simple local control law for each subsystem, consisting of an actual control with one adaptive function approximator, without requiring the use of multiple function approximators and regardless of the order of each subsystem. The virtual controllers for each subsystem are used as intermediate signals for designing a local actual control at the last step. For each subsystem, a lumped unknown function including the unknown nonlinear terms and the hysteresis nonlinearities is derived at the last step and is estimated by one function approximator. Thus, the proposed approach only uses one function approximator to implement each local controller, while existing decentralized backstepping control methods require the number of function approximators equal to the order of each subsystem and a calculation of virtual controllers to implement each local actual controller. The stability of the total controlled closed-loop system is analyzed using the Lyapunov stability theorem.
Reagor, James A; Holt, David W
2016-03-01
Advances in technology, the desire to minimize blood product transfusions, and concerns relating to inflammatory mediators have led many practitioners and manufacturers to minimize cardiopulmonary bypass (CPB) circuit designs. The oxygenator and arterial line filter (ALF) have been integrated into one device as a method of attaining a reduction in prime volume and surface area. The instructions for use of a currently available oxygenator with integrated ALF recommend incorporating a recirculation line distal to the oxygenator. However, according to an unscientific survey, 70% of respondents utilize CPB circuits incorporating integrated ALFs without a path of recirculation distal to the oxygenator outlet. Considering this circuit design, the ability to quickly remove a gross air bolus in the blood path distal to the oxygenator may be compromised. This in vitro study was designed to determine whether the time required to remove a gross air bolus from a CPB circuit without a path of recirculation distal to the oxygenator is significantly longer than that of a circuit with such a path. A significant difference was found in the mean time required to remove a gross air bolus between the circuit designs (p = .0003). Additionally, there was a statistically significant difference in the mean time required to remove a gross air bolus between Trial 1 and Trials 4 (p = .015) and 5 (p = .014), irrespective of the circuit design. Under the parameters of this study, a recirculation line distal to an oxygenator with an integrated ALF significantly decreases the time it takes to remove an air bolus from the CPB circuit and may be safer for clinical use than the same circuit without a recirculation line.
Salkin, J A; Stuchin, S A; Kummer, F J; Reininger, R
1995-11-01
Five types of commercial glove liners (within double latex gloves) were compared to single and double latex gloves for cut and puncture resistance and for relative manual dexterity and degree of sensibility. An apparatus was constructed to test glove-pseudofinger constructs in either a cutting or puncture mode. Cutting forces, cutting speed, and type of blade (serrated or scalpel blade) were varied and the time to cut-through measured by an electrical conductivity circuit. Penetration forces were similarly determined with a scalpel blade and a suture needle using a spring scale loading apparatus. Dexterity was measured with an object placement task among a group of orthopedic surgeons. Sensibility was assessed with Semmes-Weinstein monofilaments, two-point discrimination, and vibrametry using standard techniques and rating scales. A subjective evaluation was performed at the end of testing. Time to cut-through for the liners ranged from 2 to 30 seconds for a rapid oscillating scalpel and 4 to 40 seconds for a rapid oscillating serrated knife under minimal loads. When a 1 kg load was added, times to cut-through ranged from 0.4 to 1.0 second. In most cases, the liners were superior to double latex. On average, 100% more force was required to penetrate the liners with a scalpel and 50% more force was required to penetrate the liners with a suture needle compared to double latex. Object placement task times were not significantly different for the liners compared to double latex gloves. Semmes-Weinstein monofilaments, two-point discrimination, and vibrametry showed no difference in sensibility among the various liners and double latex gloves. Subjects felt that the liners were minimally to moderately impairing. An acclimation period may be required for their effective use.
Immobilization Techniques to Avoid Enzyme Loss from Oxidase-Based Biosensors: A One-Year Study
House, Jody L.; Anderson, Ellen M.; Ward, W. Kenneth
2007-01-01
Background Continuous amperometric sensors that measure glucose or lactate require a stable sensitivity, and glutaraldehyde crosslinking has been used widely to avoid enzyme loss. Nonetheless, little data has been published on the effectiveness of enzyme immobilization with glutaraldehyde. Methods A combination of electrochemical testing and spectrophotometric assays was used to study the relationship between enzyme shedding and the fabrication procedure. In addition, we studied the relationship between the glutaraldehyde concentration and sensor performance over a period of one year. Results The enzyme immobilization process by glutaraldehyde crosslinking to glucose oxidase appears to require at least 24 hours at room temperature to reach completion. In addition, excess free glucose oxidase can be removed by soaking sensors in purified water for 20 minutes. Even with the addition of these steps, however, it appears that there is some free glucose oxidase entrapped within the enzyme layer, which contributes to a decline in sensitivity over time. Although it reduces the ultimate sensitivity (probably via a change in the enzyme's natural conformation), glutaraldehyde concentration in the enzyme layer can be increased in order to minimize this instability. Conclusions After exposure of oxidase enzymes to glutaraldehyde, effective crosslinking requires a rinse step and a 24-hour incubation step. In order to minimize the loss of sensor sensitivity over time, the glutaraldehyde concentration can be increased. PMID:19888375
Pietsch, M; Djahani, O; Zweiger, Ch; Plattner, F; Radl, R; Tschauner, Ch; Hofmann, S
2013-10-01
Recently, new custom-fit pin guides in total knee arthroplasty (TKA) have been introduced. Use of these guides may reduce operating time. Use of the guides combined with the absence of intramedullary alignment jigs may lead to reduced blood loss and improved early outcomes. Our aim was to evaluate blood loss and early clinical outcomes in patients undergoing minimally invasive TKA using custom-fit magnetic resonance imaging (MRI)-based pin guides. A prospective study in 80 patients was carried out. Patients were divided randomly into 2 equal groups. In one group, intramedullary alignment jigs were used. In the second group, custom-fit MRI-based pin guides were used. All patients received the same cemented posterior-stabilized implant through a mini-midvastus approach. The volume in the drain bottles was recorded after 48 h. Hb loss was estimated by subtracting the postoperative from the preoperative Hb level. Transfusion requirements and surgical time were recorded. Outcome measures were Knee Society Scores (KSS), knee flexion, knee swelling and pain. Mean blood drainage was lower in the custom-fit group (391 ml vs. 603 ml; p < 0.0001). There was no difference in estimated loss of Hb (3.6 g/dl vs. 4.1 g/dl; n.s.) or in transfusion requirements (7.5 % vs. 10 %; n.s.). Surgical time was reduced in the custom-fit group (12 min less; p = 0.001). KSS measured at weeks 2, 6 and 12 showed no significant difference between groups. Knee flexion measured on days 7 and 10 and at weeks 6 and 12, and knee swelling and pain measured on days 1, 3 and 10 and at weeks 6 and 12, showed no significant difference between groups. Using custom-fit pin guides reduces blood drainage, but not the estimated Hb loss, in minimally invasive TKA and does not affect transfusion rate. Surgical time is reduced. There is no effect on the early clinical outcomes. Therapeutic study, Level I.
Prince, Linda M
2015-01-01
Inter-simple sequence repeat PCR (ISSR-PCR) is a fast, inexpensive genotyping technique based on length variation in the regions between microsatellites. The method requires no species-specific prior knowledge of microsatellite location or composition. Very small amounts of DNA are required, making this method ideal for organisms of conservation concern, or where the quantity of DNA is extremely limited due to organism size. ISSR-PCR can be highly reproducible but requires careful attention to detail. Optimization of DNA extraction, fragment amplification, and normalization of fragment peak heights during fluorescent detection are critical steps to minimizing the downstream time spent verifying and scoring the data.
Fan, Ching-Lin; Shang, Ming-Chi; Li, Bo-Jyun; Lin, Yu-Zuo; Wang, Shea-Jue; Lee, Win-Der
2014-08-11
Minimizing the parasitic capacitance and the number of photo-masks can improve operational speed and reduce fabrication costs. Therefore, in this study, a new two-photo-mask process is proposed that exhibits a self-aligned structure without an etching-stop layer. Combining the backside-ultraviolet (BUV) exposure and backside-lift-off (BLO) schemes not only prevents damage when etching the source/drain (S/D) electrodes but also reduces the number of photo-masks required during fabrication and, at the same time, minimizes the parasitic capacitance by decreasing the gate overlap length. Compared with traditional fabrication processes, thin-film transistors (TFTs) fabricated using the proposed process exhibit comparable field-effect mobility (9.5 cm²/V·s), threshold voltage (3.39 V), and subthreshold swing (0.3 V/decade). The delay time of an inverter fabricated using the proposed process was considerably decreased.
2017-01-01
Hepatic encephalopathy (HE) is a reversible syndrome of impaired brain function occurring in patients with advanced liver diseases. The precise pathophysiology of HE is still under discussion; the leading hypotheses focus on the role of neurotoxins, impaired neurotransmission due to metabolic changes in liver failure, changes in brain energy metabolism, systemic inflammatory response and alterations of the blood-brain barrier. HE produces a wide spectrum of nonspecific neurological and psychiatric manifestations. Minimal HE is diagnosed by abnormal psychometric tests. Clinically overt HE includes personality changes, alterations in consciousness, progressive disorientation in time and space, somnolence, stupor and, finally, coma. Except for clinical studies, no specific tests are required for diagnosis. HE is classified according to the underlying disease, the severity of manifestations, its time course and the existence of precipitating factors. Treatment of overt HE includes supportive therapies, treatment of precipitating factors, lactulose and/or rifaximin. Routine treatment for minimal HE is only recommended for selected patients. PMID:28533911
Systems biology perspectives on minimal and simpler cells.
Xavier, Joana C; Patil, Kiran Raosaheb; Rocha, Isabel
2014-09-01
The concept of the minimal cell has fascinated scientists for a long time, from both fundamental and applied points of view. This broad concept encompasses extreme reductions of genomes, the last universal common ancestor (LUCA), the creation of semiartificial cells, and the design of protocells and chassis cells. Here we review these different areas of research and identify common and complementary aspects of each one. We focus on systems biology, a discipline that is greatly facilitating the classical top-down and bottom-up approaches toward minimal cells. In addition, we also review the so-called middle-out approach and its contributions to the field with mathematical and computational models. Owing to the advances in genomics technologies, much of the work in this area has been centered on minimal genomes, or rather minimal gene sets, required to sustain life. Nevertheless, a fundamental expansion has been taking place in the last few years wherein the minimal gene set is viewed as a backbone of a more complex system. Complementing genomics, progress is being made in understanding the system-wide properties at the levels of the transcriptome, proteome, and metabolome. Network modeling approaches are enabling the integration of these different omics data sets toward an understanding of the complex molecular pathways connecting genotype to phenotype. We review key concepts central to the mapping and modeling of this complexity, which is at the heart of research on minimal cells. Finally, we discuss the distinction between minimizing the number of cellular components and minimizing cellular complexity, toward an improved understanding and utilization of minimal and simpler cells. Copyright © 2014, American Society for Microbiology. All Rights Reserved.
Monteverdi, B
2001-01-01
The explosive growth of handheld personal digital assistants (PDAs) in health care has been nothing short of amazing. What applications--business and clinical--do these devices have in medicine, and what is their potential? PDAs are simple and intuitive; their applications require minimal interaction time, so they have minimal impact on work flow; the investment is small; and the lightweight form is relatively nonintrusive during a patient encounter. The devices are being used to capture charges for medical services at the point of care. Encounter capture, online prescription writing and other applications will soon come on the scene. This article discusses current and possible future uses for PDAs in health care, interfaces with other technologies, and security concerns.
Minimal Increase Network Coding for Dynamic Networks.
Zhang, Guoyin; Fan, Xu; Wu, Yanxia
2016-01-01
Because of the mobility, computing power and changeable topology of dynamic networks, it is difficult for random linear network coding (RLNC), designed for static networks, to satisfy the requirements of dynamic networks. To alleviate this problem, a minimal increase network coding (MINC) algorithm is proposed. By identifying the nonzero elements of an encoding vector, it selects blocks to be encoded on the basis of the relationship between the nonzero elements, which controls the changes in the degrees of the blocks; the encoding time is thereby shortened in a dynamic network. The results of simulations show that, compared with existing encoding algorithms, the MINC algorithm provides reduced computational complexity of encoding and an increased probability of delivery.
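A toy sketch of the general idea, heavily hedged: this is not the published MINC algorithm, only an illustration of preferring the candidate source block whose GF(2) combination least increases the degree (number of nonzero elements) of an encoding vector. All vectors below are made up.

```python
# Toy illustration only: pick the candidate whose XOR into the current
# encoding vector yields the smallest degree increase. Not the MINC paper's
# algorithm; vectors and sizes are invented for this example.
import numpy as np

def degree(vec):
    return int(np.count_nonzero(vec))

def pick_minimal_increase(current, candidates):
    """current: GF(2) encoding vector; candidates: list of GF(2) vectors."""
    best, best_inc = None, None
    for c in candidates:
        new_vec = np.bitwise_xor(current, c)      # GF(2) addition
        inc = degree(new_vec) - degree(current)
        if best_inc is None or inc < best_inc:
            best, best_inc = new_vec, inc
    return best, best_inc

current = np.array([1, 0, 1, 0], dtype=np.uint8)
candidates = [np.array(v, dtype=np.uint8)
              for v in ([1, 1, 0, 0], [0, 0, 1, 1], [1, 0, 1, 1])]
print(pick_minimal_increase(current, candidates))  # picks the third: degree drops
```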
A system architecture for online data interpretation and reduction in fluorescence microscopy
NASA Astrophysics Data System (ADS)
Röder, Thorsten; Geisbauer, Matthias; Chen, Yang; Knoll, Alois; Uhl, Rainer
2010-01-01
In this paper we present a high-throughput sample screening system that enables real-time data analysis and reduction for live cell analysis using fluorescence microscopy. We propose a novel system architecture capable of analyzing a large amount of samples during the experiment and thus greatly minimizing the post-analysis phase that is the common practice today. By utilizing data reduction algorithms, relevant information of the target cells is extracted from the online collected data stream, and then used to adjust the experiment parameters in real-time, allowing the system to dynamically react on changing sample properties and to control the microscope setup accordingly. The proposed system consists of an integrated DSP-FPGA hybrid solution to ensure the required real-time constraints, to execute efficiently the underlying computer vision algorithms and to close the perception-action loop. We demonstrate our approach by addressing the selective imaging of cells with a particular combination of markers. With this novel closed-loop system the amount of superfluous collected data is minimized, while at the same time the information entropy increases.
Analytical solutions to non-Fickian subsurface dispersion in uniform groundwater flow
Zou, S.; Xia, J.; Koussis, Antonis D.
1996-01-01
Analytical solutions are obtained by the Fourier transform technique for the one-, two-, and three-dimensional transport of a conservative solute injected instantaneously in a uniform groundwater flow. These solutions account for dispersive non-linearity caused by the heterogeneity of the hydraulic properties of aquifer systems and can be used as building blocks to construct solutions by convolution (principle of superposition) for source conditions other than slug injection. The dispersivity is assumed to vary parabolically with time and is thus constant for the entire system at any given time. Two approaches for estimating time-dependent dispersion parameters are developed for two-dimensional plumes. They both require minimal field tracer test data and, therefore, represent useful tools for assessing real-world aquifer contamination sites. The first approach requires mapped plume-area measurements at two specific times after the tracer injection. The second approach requires concentration-versus-time data from two sampling wells through which the plume passes. Detailed examples and comparisons with other procedures show that the methods presented herein are sufficiently accurate and easier to use than other available methods.
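For orientation, a minimal sketch (not the paper's time-dependent result) of the classical constant-dispersivity solution on which such derivations build: the two-dimensional concentration field from an instantaneous slug of mass M injected at the origin into a uniform flow of velocity v along x, in an aquifer of porosity n and thickness b:

```latex
C(x,y,t) \;=\; \frac{M}{4\pi\, n\, b\, t\,\sqrt{D_x D_y}}
\exp\!\left[-\frac{(x-vt)^2}{4 D_x t}-\frac{y^2}{4 D_y t}\right]
```

In the paper, the dispersion coefficients D_x and D_y become time dependent (parabolically varying dispersivity), but the constant-coefficient form above shows the structure being generalized and convolved for non-slug sources.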
Joint minimization of uplink and downlink whole-body exposure dose in indoor wireless networks.
Plets, D; Joseph, W; Vanhecke, K; Vermeeren, G; Wiart, J; Aerts, S; Varsier, N; Martens, L
2015-01-01
The total whole-body exposure dose in indoor wireless networks is minimized. For the first time, indoor wireless networks are designed and simulated for a minimal exposure dose, where both uplink and downlink are considered. The impact of the minimization is numerically assessed for four scenarios: two WiFi configurations with different throughputs, a Universal Mobile Telecommunications System (UMTS) configuration for phone call traffic, and a Long-Term Evolution (LTE) configuration with a high data rate. Also, the influence of the uplink usage on the total absorbed dose is characterized. Downlink dose reductions of at least 75% are observed when adding more base stations with a lower transmit power. Total dose reductions decrease with increasing uplink usage for WiFi due to the lack of uplink power control but are maintained for LTE and UMTS. Uplink doses become dominant over downlink doses for usages of only a few seconds for WiFi. For UMTS and LTE, an almost continuous uplink usage is required to have a significant effect on the total dose, thanks to the power control mechanism.
Øbro, Nina F; Ryder, Lars P; Madsen, Hans O; Andersen, Mette K; Lausen, Birgitte; Hasle, Henrik; Schmiegelow, Kjeld; Marquart, Hanne V
2012-01-01
Reduction in minimal residual disease, measured by real-time quantitative PCR or flow cytometry, predicts prognosis in childhood B-cell precursor acute lymphoblastic leukemia. We explored whether cells reported as minimal residual disease by flow cytometry represent the malignant clone harboring clone-specific genomic markers (53 follow-up bone marrow samples from 28 children with B-cell precursor acute lymphoblastic leukemia). Cell populations (presumed leukemic and non-leukemic) were flow-sorted during standard flow cytometry-based minimal residual disease monitoring and explored by PCR and/or fluorescence in situ hybridization. We found good concordance between flow cytometry and genomic analyses in the individual flow-sorted leukemic (93% true positive) and normal (93% true negative) cell populations. Four cases with discrepant results had plausible explanations (e.g. partly informative immunophenotype and antigen modulation) that highlight important methodological pitfalls. These findings demonstrate that with sufficient experience, flow cytometry is reliable for minimal residual disease monitoring in B-cell precursor acute lymphoblastic leukemia, although rare cases require supplementary PCR-based monitoring.
Krumholz, Harlan M; Hsieh, Angela; Dreyer, Rachel P; Welsh, John; Desai, Nihar R; Dharmarajan, Kumar
2016-01-01
The risk of rehospitalization is elevated in the immediate post-discharge period and declines over time. It is not known if the extent and timing of risk vary across readmission diagnoses, suggesting that recovery and vulnerability after discharge differ by physiologic system. We compared risk trajectories for major readmission diagnoses in the year after discharge among all Medicare fee-for-service beneficiaries hospitalized with heart failure (HF), acute myocardial infarction (AMI), or pneumonia from 2008-2010. We estimated the daily risk of rehospitalization for 12 major readmission diagnostic categories after accounting for the competing risk of death after discharge. For each diagnostic category, we identified (1) the time required for readmission risk to peak and then decline 50% from maximum values after discharge; (2) the time required for readmission risk to approach plateau periods of minimal day-to-day change; and (3) the extent to which hospitalization risks are higher among patients recently discharged from the hospital compared with the general elderly population. Among >3,000,000 hospitalizations, the yearly rate of rehospitalization was 67.0%, 49.5%, and 55.3% after hospitalization for HF, AMI, and pneumonia, respectively. The extent and timing of risk varied by readmission diagnosis and initial admitting condition. Risk of readmission for gastrointestinal bleeding/anemia peaked particularly late after hospital discharge, occurring 10, 6, and 7 days after hospitalization for HF, AMI, and pneumonia, respectively. Risk of readmission for trauma/injury declined particularly slowly, requiring 38, 20, and 38 days to decline by 50% after hospitalization for HF, AMI, and pneumonia, respectively. Patterns of vulnerability to different conditions that cause rehospitalization vary by time after hospital discharge. This finding suggests that recovery of various physiologic systems occurs at different rates and that post-discharge interventions to minimize vulnerability to specific conditions should be tailored to their underlying risks.
Evaluation of tissue interactions with mechanical elements of a transscleral drug delivery device.
Cohen, Sarah J; Chan, Robison V Paul; Keegan, Mark; Andreoli, Christopher M; Borenstein, Jeffrey T; Miller, Joan W; Gragoudas, Evangelos S
2012-03-12
The goal of this work was to evaluate tissue-device interactions due to implantation of a mechanically operated drug delivery system onto the posterior sclera. Two test devices were designed and fabricated to model elements of the drug delivery device-one containing a free-spinning ball bearing and the other encasing two articulating gears. Openings in the base of test devices modeled ports for drug passage from device to sclera. Porous poly(tetrafluoroethylene) (PTFE) membranes were attached to half of the gear devices to minimize tissue ingrowth through these ports. Test devices were sutured onto rabbit eyes for 10 weeks. Tissue-device interactions were evaluated histologically and mechanically after removal to determine effects on device function and changes in surrounding tissue. Test devices were generally well-tolerated during residence in the animal. All devices encouraged fibrous tissue formation between the sclera and the device, fibrous tissue encapsulation and invasion around the device, and inflammation of the conjunctiva. Gear devices encouraged significantly greater inflammation in all cases and a larger rate of tissue ingrowth. PTFE membranes prevented tissue invasion through the covered drug ports, though tissue migrated in through other smaller openings. The torque required to turn the mechanical elements increased over 1000 times for gear devices, but only on the order of 100 times for membrane-covered gear devices and less than 100 times for ball bearing devices. Maintaining a lower device profile, minimizing microscale motion on the eye surface and covering drug ports with a porous membrane may minimize inflammation, decreasing the risk of damage to surrounding tissues and minimizing disruption of device operation.
System for robot-assisted real-time laparoscopic ultrasound elastography
NASA Astrophysics Data System (ADS)
Billings, Seth; Deshmukh, Nishikant; Kang, Hyun Jae; Taylor, Russell; Boctor, Emad M.
2012-02-01
Surgical robots provide many advantages for surgery, including minimal invasiveness, precise motion, high dexterity, and crisp stereovision. One limitation of current robotic procedures, compared to open surgery, is the loss of haptic information for such purposes as palpation, which can be very important in minimally invasive tumor resection. Numerous studies have reported the use of real-time ultrasound elastography, in conjunction with conventional B-mode ultrasound, to differentiate malignant from benign lesions. Several groups (including our own) have reported integration of ultrasound with the da Vinci robot, and ultrasound elastography is a very promising image guidance method for robot-assisted procedures that will further enable the role of robots in interventions where precise knowledge of sub-surface anatomical features is crucial. We present a novel robot-assisted real-time ultrasound elastography system for minimally invasive robot-assisted interventions. Our system combines a da Vinci surgical robot with a non-clinical experimental software interface, a robotically articulated laparoscopic ultrasound probe, and our GPU-based elastography system. Elasticity and B-mode ultrasound images are displayed as picture-in-picture overlays in the da Vinci console. Our system minimizes dependence on human performance factors by incorporating computer-assisted motion control that automatically generates the tissue palpation required for elastography imaging, while leaving high-level control in the hands of the user. In addition to ensuring consistent strain imaging, the elastography assistance mode avoids the cognitive burden of tedious manual palpation. Preliminary tests of the system with an elasticity phantom demonstrate the ability to differentiate simulated lesions of varied stiffness and to clearly delineate lesion boundaries.
Determining minimal display element requirements for surface map displays
DOT National Transportation Integrated Search
2003-04-14
There is a great deal of interest in developing electronic surface map displays to enhance safety and reduce incidents and incursions on or near the airport surface. There is a lack of research, however, detailing the minimal display elements require...
Callaham, Michael; John, Leslie K
2018-01-05
We define a minimally important difference for the Likert-type scores frequently used in scientific peer review (similar to existing minimally important differences for scores in clinical medicine). The magnitude of score change required to change editorial decisions has not been studied, to our knowledge. Experienced editors at a journal in the top 6% by impact factor were asked how large a change of rating in "overall desirability for publication" was required to trigger a change in their initial decision on an article. Minimally important differences were assessed twice for each editor: once assessing the rating change required to shift the editor away from an initial decision to accept, and the other assessing the magnitude required to shift away from an initial rejection decision. Forty-one editors completed the survey (89% response rate). In the acceptance frame, the median minimally important difference was 0.4 points on a scale of 1 to 5. Editors required a greater rating change to shift from an initial rejection decision; in the rejection frame, the median minimally important difference was 1.2 points. Within each frame, there was considerable heterogeneity: in the acceptance frame, 38% of editors did not change their decision within the maximum available range; in the rejection frame, 51% did not. To our knowledge, this is the first study to determine the minimally important difference for Likert-type ratings of research article quality, or in fact any nonclinical scientific assessment variable. Our findings may be useful for future research assessing whether changes to the peer review process produce clinically meaningful differences in editorial decisionmaking. Copyright © 2017 American College of Emergency Physicians. Published by Elsevier Inc. All rights reserved.
NASA Astrophysics Data System (ADS)
Kesuma, Hendra; Niederkleine, Kris; Schmale, Sebastian; Ahobala, Tejas; Paul, Steffen; Sebald, Johannes
2016-08-01
In this work, we design and implement an efficient time synchronization and time-stamping method for the Wireless Sensor Network inside the Vehicle Equipment Bay (VEB) of the ARIANE 5. The sensor nodes in the network do not require real-time clock (RTC) hardware to store and stamp each measurement made by the sensors. Only the measurement sequence information, the previous time (clock) information, the measurement data, and the related protocol information are sent back to the Access Point (AP). This leads to less data transmission and less energy and time required for the sensor nodes to operate, and hence to a longer battery lifetime. Visible Light Communication (VLC) is used to provide energy, to synchronize time, and to deliver commands to the sensor nodes in the network. By employing a star network topology and using part of the solar cell as the receiver, the conventional RF/infrared receiver is omitted, reducing the amount of hardware and the energy consumption. An infrared transmitter is deployed on the sensor node to minimize electromagnetic interference in the launcher; it does not require a complicated circuit compared with an RF transmitter.
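A hypothetical sketch of the time-stamping idea described above: nodes transmit only a sequence counter and the clock value from the last synchronization, and the Access Point reconstructs absolute measurement times. The field names, packet layout, and sampling period are assumptions, not the flight design.

```python
# Hypothetical sketch: the AP reconstructs timestamps from a sequence number
# and the clock value at the last VLC synchronization. All fields are assumed.
SAMPLE_PERIOD_S = 0.01   # assumed fixed sampling interval on the node

def reconstruct_timestamps(packet):
    """packet: dict with 'sync_time' (s, clock at last sync),
    'first_seq' (sequence number of first sample) and 'samples'."""
    t0 = packet["sync_time"]
    return [(t0 + (packet["first_seq"] + i) * SAMPLE_PERIOD_S, value)
            for i, value in enumerate(packet["samples"])]

pkt = {"sync_time": 1200.0, "first_seq": 42, "samples": [0.71, 0.69, 0.73]}
for t, v in reconstruct_timestamps(pkt):
    print(f"t = {t:.2f} s, value = {v}")
```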
Ultrasound Picture Archiving And Communication Systems
NASA Astrophysics Data System (ADS)
Koestner, Ken; Hottinger, C. F.
1982-01-01
The ideal ultrasonic image communication and storage system must be flexible in order to optimize speed and minimize storage requirements. Various ultrasonic imaging modalities are quite different in data volume and speed requirements. Static imaging, for example B-Scanning, involves acquisition of a large amount of data that is averaged or accumulated in a desired manner. The image is then frozen in image memory before transfer and storage. Images are commonly a 512 x 512 point array, each point 6 bits deep. Transfer of such an image over a serial line at 9600 baud would require about three minutes. Faster transfer times are possible; for example, we have developed a parallel image transfer system using direct memory access (DMA) that reduces the time to 16 seconds. Data in this format requires 256K bytes for storage. Data compression can be utilized to reduce these requirements. Real-time imaging has much more stringent requirements for speed and storage. The amount of actual data per frame in real-time imaging is reduced due to physical limitations on ultrasound. For example, 100 scan lines (480 points long, 6 bits deep) can be acquired during a frame at a 30 per second rate. Transmitting and saving this data at a real-time rate requires a transfer rate of 8.6 megabaud. A real-time archiving system would be complicated by the necessity of specialized hardware to interpolate between scan lines and perform desirable greyscale manipulation on recall. Image archiving for cardiology and radiology would require data transfer at this high rate to preserve temporal (cardiology) and spatial (radiology) information.
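The quoted figures follow from simple arithmetic; a quick check (assuming, as the 256K figure suggests, that each 6-bit point is stored in one byte):

```python
# Back-of-the-envelope check of the figures quoted above (assumptions noted):
image_bits = 512 * 512 * 6                 # static B-scan image, 6 bits/point
print(image_bits / 9600 / 60)              # ~2.7 min over a 9600-baud serial line
print(512 * 512 / 1024)                    # 256 KB if each point occupies one byte

frame_bits = 100 * 480 * 6                 # real-time frame: 100 lines x 480 points
print(frame_bits * 30 / 1e6)               # ~8.6 Mbit/s needed at 30 frames/s
```

At 9600 baud the raw 6-bit image takes roughly 2.7 minutes, consistent with the "about three minutes" quoted once framing overhead is included.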
Time Management in the Operating Room: An Analysis of the Dedicated Minimally Invasive Surgery Suite
Hsiao, Kenneth C.; Machaidze, Zurab
2004-01-01
Background: Dedicated minimally invasive surgery suites are available that contain specialized equipment to facilitate endoscopic surgery. Laparoscopy performed in a general operating room is hampered by the multitude of additional equipment that must be transported into the room. The objective of this study was to compare the preparation times between procedures performed in traditional operating rooms versus dedicated minimally invasive surgery suites to see whether operating room efficiency is improved in the specialized room. Methods: The records of 50 patients who underwent laparoscopic procedures between September 2000 and April 2002 were retrospectively reviewed. Twenty-three patients underwent surgery in a general operating room and 18 patients in a minimally invasive surgery suite. Nine patients were excluded because of cystoscopic procedures undergone prior to laparoscopy. Various time points were recorded from which various time intervals were derived, such as preanesthesia time, anesthesia induction time, and total preparation time. A 2-tailed, unpaired Student t test was used for statistical analysis. Results: The mean preanesthesia time was significantly shorter in the minimally invasive surgery suite (12.2 minutes) compared with that in the traditional operating room (17.8 minutes) (P=0.013). Mean anesthesia induction time in the minimally invasive surgery suite (47.5 minutes) was similar to time in the traditional operating room (45.7 minutes) (P=0.734). The average total preparation time for the minimally invasive surgery suite (59.6 minutes) was not significantly faster than that in the general operating room (63.5 minutes) (P=0.481). Conclusion: The amount of time that elapses between the patient entering the room and anesthesia induction is statistically shorter in a dedicated minimally invasive surgery suite. Laparoscopic surgery is performed more efficiently in a dedicated minimally invasive surgery suite versus a traditional operating room. PMID:15554269
Riley, William T; Serrano, Katrina J; Nilsen, Wendy; Atienza, Audie A
2015-10-01
Recent advances in mobile and wireless technologies have made real-time assessments of health behaviors and their influences possible with minimal respondent burden. These tech-enabled real-time assessments provide the basis for intensively adaptive interventions (IAIs). Evidence of such studies that adjust interventions based on real-time inputs is beginning to emerge. Although IAIs are promising, the development of intensively adaptive algorithms generate new research questions, and the intensive longitudinal data produced by IAIs require new methodologies and analytic approaches. Research considerations and future directions for IAIs in health behavior research are provided.
[Pharmaceutical logistic in turnover of pharmaceutical products of Azerbaijan].
Dzhalilova, K I
2009-11-01
Development of a pharmaceutical logistics system model promotes an optimal strategy for pharmacy operations. The goal of such systems is to organize the turnover of pharmaceutical products in the required quantity and assortment, at the specified time and place, at the highest possible degree of readiness for consumption, with minimal expense and high-quality service. It is proposed that organization of the optimal turnover chain in the region start with an approximate classification of medicaments by their logistic characteristics. Supplier selection was performed by evaluating timeliness of delivery, quality of delivered products (relative to the minimum acceptable quality level), and adherence to the scheduled time for order delivery.
Robotic partial nephrectomy for complex renal tumors: surgical technique.
Rogers, Craig G; Singh, Amar; Blatt, Adam M; Linehan, W Marston; Pinto, Peter A
2008-03-01
Laparoscopic partial nephrectomy requires advanced training to accomplish tumor resection and renal reconstruction while minimizing warm ischemia times. Complex renal tumors add an additional challenge to a minimally invasive approach to nephron-sparing surgery. We describe our technique, illustrated with video, of robotic partial nephrectomy for complex renal tumors, including hilar, endophytic, and multiple tumors. Robotic assistance was used to resect 14 tumors in eight patients (mean age: 50.3 yr; range: 30-68 yr). Three patients had hereditary kidney cancer. All patients had complex tumor features, including hilar tumors (n=5), endophytic tumors (n=4), and/or multiple tumors (n=3). Robotic partial nephrectomy procedures were performed successfully without complications. Hilar clamping was used with a mean warm ischemia time of 31 min (range: 24-45 min). Mean blood loss was 230 ml (range: 100-450 ml). Histopathology confirmed clear-cell renal cell carcinoma (n=3), hybrid oncocytic tumor (n=2), chromophobe renal cell carcinoma (n=2), and oncocytoma (n=1). All patients had negative surgical margins. Mean index tumor size was 3.6 cm (range: 2.6-6.4 cm). Mean hospital stay was 2.6 d. At 3-mo follow-up, no patients experienced a statistically significant change in serum creatinine or estimated glomerular filtration rate and there was no evidence of tumor recurrence. Robotic partial nephrectomy is safe and feasible for select patients with complex renal tumors, including hilar, endophytic, and multiple tumors. Robotic assistance may facilitate a minimally invasive, nephron-sparing approach for select patients with complex renal tumors who might otherwise require open surgery or total nephrectomy.
Differences in care burden of patients undergoing dialysis in different centres in the netherlands.
de Kleijn, Ria; Uyl-de Groot, Carin; Hagen, Chris; Diepenbroek, Adry; Pasker-de Jong, Pieternel; Ter Wee, Piet
2017-06-01
A classification model was developed to simplify planning of personnel at dialysis centres. This model predicted the care burden based on dialysis characteristics. However, patient characteristics and different dialysis centre categories might also influence the amount of care time required. To determine if there is a difference in care burden between different categories of dialysis centres and if specific patient characteristics predict nursing time needed for patient treatment. An observational study. Two hundred and forty-two patients from 12 dialysis centres. In 12 dialysis centres, nurses filled out the classification list per patient and completed a form with patient characteristics. Nephrologists filled out the Charlson Comorbidity Index. Independent observers clocked the time nurses spent on separate steps of the dialysis for each patient. Dialysis centres were categorised into four types. Data were analysed using regression models. In contrast to other dialysis centres, academic centres needed 14 minutes more care time per patient per dialysis treatment than predicted in the classification model. No patient characteristics were found that influenced this difference. The only patient characteristic that predicted the time required was gender, with more time required to treat women. Gender did not affect the difference between measured and predicted care time. Differences in care burden were observed between academic and other centres, with more time required for treatment in academic centres. Contribution of patient characteristics to the time difference was minimal. The only patient characteristics that predicted care time were previous transplantation, which reduced the time required, and gender, with women requiring more care time. © 2017 European Dialysis and Transplant Nurses Association/European Renal Care Association.
5 CFR 582.203 - Information minimally required to accompany legal process.
Code of Federal Regulations, 2012 CFR
2012-01-01
... accompany legal process. 582.203 Section 582.203 Administrative Personnel OFFICE OF PERSONNEL MANAGEMENT CIVIL SERVICE REGULATIONS COMMERCIAL GARNISHMENT OF FEDERAL EMPLOYEES' PAY Service of Legal Process § 582.203 Information minimally required to accompany legal process. (a) Sufficient identifying...
5 CFR 582.203 - Information minimally required to accompany legal process.
Code of Federal Regulations, 2010 CFR
2010-01-01
... accompany legal process. 582.203 Section 582.203 Administrative Personnel OFFICE OF PERSONNEL MANAGEMENT CIVIL SERVICE REGULATIONS COMMERCIAL GARNISHMENT OF FEDERAL EMPLOYEES' PAY Service of Legal Process § 582.203 Information minimally required to accompany legal process. (a) Sufficient identifying...
5 CFR 582.203 - Information minimally required to accompany legal process.
Code of Federal Regulations, 2011 CFR
2011-01-01
... accompany legal process. 582.203 Section 582.203 Administrative Personnel OFFICE OF PERSONNEL MANAGEMENT CIVIL SERVICE REGULATIONS COMMERCIAL GARNISHMENT OF FEDERAL EMPLOYEES' PAY Service of Legal Process § 582.203 Information minimally required to accompany legal process. (a) Sufficient identifying...
5 CFR 582.203 - Information minimally required to accompany legal process.
Code of Federal Regulations, 2014 CFR
2014-01-01
... accompany legal process. 582.203 Section 582.203 Administrative Personnel OFFICE OF PERSONNEL MANAGEMENT CIVIL SERVICE REGULATIONS COMMERCIAL GARNISHMENT OF FEDERAL EMPLOYEES' PAY Service of Legal Process § 582.203 Information minimally required to accompany legal process. (a) Sufficient identifying...
5 CFR 582.203 - Information minimally required to accompany legal process.
Code of Federal Regulations, 2013 CFR
2013-01-01
... accompany legal process. 582.203 Section 582.203 Administrative Personnel OFFICE OF PERSONNEL MANAGEMENT CIVIL SERVICE REGULATIONS COMMERCIAL GARNISHMENT OF FEDERAL EMPLOYEES' PAY Service of Legal Process § 582.203 Information minimally required to accompany legal process. (a) Sufficient identifying...
Code of Federal Regulations, 2012 CFR
2012-07-01
... 1,000 hours of operation or annually, whichever comes first; 1 Minimize the engine's time spent at... apply. b. Inspect air cleaner every 1,000 hours of operation or annually, whichever comes first; c. Inspect all hoses and belts every 500 hours of operation or annually, whichever comes first, and replace...
12 strategies for managing capital projects.
Stoudt, Richard L
2013-05-01
To reduce the amount of time and cost associated with capital projects, healthcare leaders should: Begin the project with a clear objective and a concise master facilities plan. Select qualified team members who share the vision of the owner. Base the size of the project on a conservative business plan. Minimize incremental program requirements. Evaluate the cost impact of the building footprint. Consider alternative delivery methods.
Straight-Pore Microfilter with Efficient Regeneration
NASA Technical Reports Server (NTRS)
Liu, Han; LaConti, Anthony B.; McCallum. Thomas J.; Schmitt, Edwin W.
2010-01-01
A novel, high-efficiency gas particulate filter has precise particle size screening, low pressure drop, and a simple and fast regeneration process. The regeneration process, which requires minimal material and energy consumption, can be completely automated, and the filtration performance can be restored within a very short period of time. This filter is of a novel material composite that contains the support structure and a novel coating.
Psychological adaptation of nurses post-disaster.
Waters, K A; Selander, J; Stuart, G W
1992-01-01
Disasters have the potential to cause major disruptions in lifeline services and family support systems. As caregivers, nurses are required to make difficult choices during national emergencies and may be at risk for experiencing psychological distress following a disaster. This study describes the responses of a group of nurses following Hurricane Hugo, and makes recommendations to minimize the stress placed on nurses working in a time of disaster.
USDA-ARS?s Scientific Manuscript database
Time-temperature control of fresh-cut produce at 41 °F (5 °C) or less can significantly reduce the growth of human pathogens. Since 2009, the FDA Food Code has required that packaged ready-to-eat leafy greens be kept at 41 °F (5 °C) or lower to minimize the potential of pathogen proliferation in the...
Dutta, Soumita; Avasthi, Prachee
2017-01-01
The unicellular green alga Chlamydomonas reinhardtii is an ideal model organism for studies of ciliary function and assembly. In assays for biological and biochemical effects of various factors on flagellar structure and function, synchronous culture is advantageous for minimizing variability. Here, we have characterized a method in which 100% synchronization is achieved with respect to flagellar length but not with respect to the cell cycle. The method requires inducing flagellar regeneration by amputation of the entire cell population and limiting regeneration time. This results in a maximally homogeneous distribution of flagellar lengths at 3 h postamputation. We found that time-limiting new protein synthesis during flagellar synchronization limits variability in the unassembled pool of limiting flagellar protein and variability in flagellar length without affecting the range of cell volumes. We also found that long- and short-flagella mutants that regenerate normally require longer and shorter synchronization times, respectively. By minimizing flagellar length variability using a simple method requiring only hours and no changes in media, flagellar synchronization facilitates the detection of small changes in flagellar length resulting from both chemical and genetic perturbations in Chlamydomonas. This method increases our ability to probe the basic biology of ciliary size regulation and related disease etiologies. IMPORTANCE Cilia and flagella are highly conserved antenna-like organelles that are found in nearly all mammalian cell types. They perform sensory and motile functions contributing to numerous physiological and developmental processes. Defects in their assembly and function are implicated in a wide range of human diseases ranging from retinal degeneration to cancer. Chlamydomonas reinhardtii is an algal model system for studying mammalian cilium formation and function. Here, we report a simple synchronization method that allows detection of small changes in ciliary length by minimizing variability in the population. We find that this method alters the key relationship between cell size and the amount of protein accumulated for flagellar growth. This provides a rapid alternative to traditional methods of cell synchronization for uncovering novel regulators of cilia.
Targeting Low-Energy Ballistic Lunar Transfers
NASA Technical Reports Server (NTRS)
Parker, Jeffrey S.
2010-01-01
Numerous low-energy ballistic transfers exist between the Earth and Moon that require less fuel than conventional transfers, but require three or more months of transfer time. An entirely ballistic lunar transfer departs the Earth from a particular declination at some time in order to arrive at the Moon at a given time along a desirable approach. Maneuvers may be added to the trajectory in order to adjust the Earth departure to meet mission requirements. In this paper, we characterize the ΔV cost required to adjust a low-energy ballistic lunar transfer such that a spacecraft may depart the Earth at a desirable declination, e.g., 28.5°, on a designated date. This study identifies the optimal locations to place one or two maneuvers along a transfer to minimize the ΔV cost of the transfer. One practical application of this study is to characterize the launch period for a mission that aims to launch from a particular launch site, such as Cape Canaveral, Florida, and arrive at a particular orbit at the Moon on a given date using a three-month low-energy transfer.
Advanced data acquisition and display techniques for laser velocimetry
NASA Technical Reports Server (NTRS)
Kjelgaard, Scott O.; Weston, Robert P.
1991-01-01
The Basic Aerodynamics Research Tunnel (BART) has been equipped with state-of-the-art instrumentation for acquiring the data needed for code validation. This paper describes the three-component LDV and the workstation-based data-acquisition system (DAS) which has been developed for the BART. The DAS allows the use of automation and the quick integration of advanced instrumentation, while minimizing the software development time required between investigations. The paper also includes a description of a graphics software library developed to support the windowing environment of the DAS. The real-time displays generated using the graphics library help the researcher ensure the test is proceeding properly. The graphics library also supports the requirements of posttest data analysis. The use of the DAS and graphics libraries is illustrated by presenting examples of the real-time and postprocessing display graphics for LDV investigations.
Alor-Hernández, Giner; Sánchez-Cervantes, José Luis; Juárez-Martínez, Ulises; Posada-Gómez, Rubén; Cortes-Robles, Guillermo; Aguilar-Laserre, Alberto
2012-03-01
Emergency healthcare is one of the emerging application domains for information services, and it requires highly multimodal information services. The time consumed by the pre-hospital emergency process is critical. Therefore, minimizing the time required to provide primary care and consultation to patients is one of the crucial factors when trying to improve healthcare delivery in emergency situations. In this sense, dynamic location of medical entities is a complex process that takes time, and it can be critical when a person requires medical attention. This work presents a multimodal location-based system for locating and assigning medical entities called ITOHealth. ITOHealth provides a multimodal middleware-oriented integrated architecture using a service-oriented architecture in order to provide information on medical entities in mobile devices and web browsers with enriched interfaces providing multimodality support. ITOHealth's multimodality is based on the use of Microsoft Agent Characters, the integration of natural-language voice for the characters, and multi-language and multi-character support, providing an advantage for users with visual impairments.
NASA Technical Reports Server (NTRS)
Whitley, Ryan J.; Jedrey, Richard; Landau, Damon; Ocampo, Cesar
2015-01-01
Mars flyby trajectories and Earth return trajectories have the potential to enable lower-cost and sustainable human exploration of Mars. Flyby and return trajectories are true minimum-energy paths with low to zero post-Earth-departure maneuvers. By emplacing the large crew vehicles required for human transit on these paths, the total fuel cost can be reduced. The traditional full-up repeating Earth-Mars-Earth cycler concept requires significant infrastructure, but a Mars-only flyby approach minimizes mission mass and maximizes opportunities to build up missions in a stepwise manner. In this paper multiple strategies for sending a crew of 4 to Mars orbit and back are examined. With pre-emplaced assets in Mars orbit, a transit habitat and a minimally functional Mars taxi, a complete Mars mission can be accomplished in 3 SLS launches and 2 Mars flybys, including Orion. While some years are better than others, ample opportunities exist within a given 15-year Earth-Mars alignment cycle. Building up a mission cadence over time, this approach can translate to Mars surface access. Risk reduction, which is always a concern for human missions, is mitigated by the use of flybys with Earth return (some of which are true free returns) capability.
Bark, David L.; Vahabi, Hamed; Bui, Hieu; Movafaghi, Sanli; Moore, Brandon; Kota, Arun K.; Popat, Ketul; Dasi, Lakshmi P.
2016-01-01
In this study, we explore how blood-material interactions and hemodynamics are impacted by rendering a clinical quality 25 mm St. Jude Medical Bileaflet mechanical heart valve (BMHV) superhydrophobic (SH) with the aim of reducing thrombo-embolic complications associated with BMHVs. Basic cell adhesion is evaluated to assess blood-material interactions, while hemodynamic performance is analyzed with and without the SH coating. Results show that a SH coating with a receding contact angle (CA) of 160° strikingly eliminates platelet and leukocyte adhesion to the surface. Alternatively, many platelets attach to and activate on pyrolytic carbon (receding CA = 47°), the base material for BMHVs. We further show that the performance index increases by 2.5% for the coated valve relative to an uncoated valve, with a maximum possible improved performance of 5%. Both valves exhibit instantaneous shear stress below 10 N/m² and Reynolds shear stress below 100 N/m². Therefore, a SH BMHV has the potential to relax the antiplatelet and anticoagulant drug regimens typically required for patients receiving MHVs by minimizing blood-material interactions, while having a minimal impact on hemodynamics. We show for the first time that SH-coated surfaces may be a promising direction to minimize thrombotic complications in complex devices such as heart valves. PMID:27098219
Orbital transfer vehicle launch operations study. Volume 2: Detailed summary
NASA Technical Reports Server (NTRS)
1986-01-01
A series of Operational Design Drivers were identified. Several of these could have significant impact(s) on program costs. These recommendations, for example, include such items as: complete factory assembly and checkout prior to shipment to the ground launch site to make significant reductions in time required at the launch site as well as overall manpower required to do this work; minimize use of nonstandard equipment when orbiter provided equipment is available; and require commonality (or interchangeability) of subsystem equipment elements that are common to the space station, Orbit Maneuvering Vehicles, and/or Orbit Transfer Vehicles. Several additional items were identified that will require a significant amount of management attention (and direction) to resolve. Key elements of the space based processing plans are discussed.
Occult traumatic hemothorax: when can sleeping dogs lie?
Bilello, John F; Davis, James W; Lemaster, Deborah M
2005-12-01
The size of a traumatic occult hemothorax on admission that requires drainage has not been defined. Computed axial tomography (CAT) may guide drainage criteria. A retrospective review of patients with hemothoraces on CAT was performed. Extrapolating previously described methods of pleural fluid measurement, hemothoraces were quantified using the fluid stripe in the dependent pleural "gutter." Data included patient age, injury severity, and intervention (thoracentesis or tube thoracostomy). Seventy-eight patients with 99 occult hemothoraces met the criteria for study inclusion: 52 hemothoraces qualified as "minimal" and 47 as "moderate/large." Eight patients (15%) in the minimal group and 31 patients (66%) in the moderate/large group underwent intervention (P < .001). There was no difference in patient age, injury severity, ventilator requirement, or presence of pulmonary contusion. CAT in stable blunt-trauma patients can predict which patients with occult hemothorax are likely to undergo intervention. Patients with hemothorax ≥ 1.5 cm on CAT were 4 times more likely to undergo drainage intervention compared with those having hemothorax < 1.5 cm.
Robust Control Design for Systems With Probabilistic Uncertainty
NASA Technical Reports Server (NTRS)
Crespo, Luis G.; Kenny, Sean P.
2005-01-01
This paper presents a reliability- and robustness-based formulation for robust control synthesis for systems with probabilistic uncertainty. In a reliability-based formulation, the probability of violating design requirements prescribed by inequality constraints is minimized. In a robustness-based formulation, a metric which measures the tendency of a random variable/process to cluster close to a target scalar/function is minimized. A multi-objective optimization procedure, which combines stability and performance requirements in the time and frequency domains, is used to search for robustly optimal compensators. Some of the fundamental differences between the proposed strategy and conventional robust control methods are: (i) unnecessary conservatism is eliminated since there is no need for convex supports, (ii) the most likely plants are favored during synthesis, allowing for probabilistic robust optimality, (iii) the tradeoff between robust stability and robust performance can be explored numerically, (iv) the uncertainty set is closely related to parameters with clear physical meaning, and (v) compensators with improved robust characteristics for a given control structure can be synthesized.
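To make the reliability-based idea concrete, the sketch below estimates the probability of violating a single inequality requirement by Monte Carlo sampling of uncertain plant parameters and then keeps the gain with the smallest estimated violation probability. The first-order plant, the parameter distributions, and the 1 rad/s bandwidth requirement are illustrative assumptions, not the compensator structure or requirements used in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical uncertain first-order plant  x' = -a x + b u  with feedback u = -k x.
# Requirement (inequality constraint): the closed-loop pole -(a + b k) must lie left of -1 rad/s,
# i.e. g(a, b, k) = 1 - (a + b k) <= 0.
N = 20000
a = rng.normal(1.0, 0.3, N)   # uncertain plant parameters (assumed distributions)
b = rng.normal(1.0, 0.1, N)

def violation_probability(k):
    g = 1.0 - (a + b * k)          # requirement margin; > 0 means the requirement is violated
    return np.mean(g > 0.0)

# Reliability-based synthesis: sweep the gain and keep the one minimizing the estimated
# probability of violating the requirement, plus a mild control-effort penalty.
gains = np.linspace(0.0, 3.0, 301)
cost = [violation_probability(k) + 0.01 * k**2 for k in gains]
k_star = gains[int(np.argmin(cost))]
print(f"selected gain k = {k_star:.2f}, "
      f"violation probability ~ {violation_probability(k_star):.4f}")
```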
Chemical Detection and Identification Techniques for Exobiology Flight Experiments
NASA Technical Reports Server (NTRS)
Kojiro, Daniel R.; Sheverev, Valery A.; Khromov, Nikolai A.
2002-01-01
Exobiology flight experiments require highly sensitive instrumentation for in situ analysis of the volatile chemical species that occur in the atmospheres and surfaces of various bodies within the solar system. The complex mixtures encountered place a heavy burden on the analytical instrumentation to detect and identify all species present. The minimal resources available onboard for such missions mandate that the instruments provide maximum analytical capabilities with minimal requirements of volume, weight, and consumables. Advances in technology may be achieved by increasing the amount of information acquired by a given technique with greater analytical capabilities and miniaturization of proven terrestrial technology. We describe here methods to develop analytical instruments for the detection and identification of a wide range of chemical species using gas chromatography (GC). These efforts to expand the analytical capabilities of GC technology are focused on the development of detectors for the GC which provide sample identification independent of GC retention time data. A novel approach employs Penning Ionization Electron Spectroscopy (PIES).
NASA Technical Reports Server (NTRS)
Sadovsky, A. V.; Davis, D.; Isaacson, D. R.
2012-01-01
We address the problem of navigating a set of moving agents, e.g., automated guided vehicles, through a transportation network so as to bring each agent to its destination at a specified time. Each pair of agents is required to be separated by a minimal distance, generally agent-dependent, at all times. The speed range, initial position, required destination, and required time of arrival at destination for each agent are assumed provided. The movement of each agent is governed by a controlled differential equation (state equation). The problem consists in choosing for each agent a path and a control strategy so as to meet the constraints and reach the destination at the required time. This problem arises in various fields of transportation, including Air Traffic Management and train coordination, and in robotics. The main contribution of the paper is a model that allows this problem to be recast as a decoupled collection of classical optimal control problems and is easily generalized to the case when inertia cannot be neglected. Some qualitative insight into solution behavior is obtained using the Pontryagin Maximum Principle. Sample numerical solutions are computed using a numerical optimal control solver.
Nickman, Nancy A; Haak, Sandra W; Kim, Jaewhan
2010-04-08
Numerous pen devices are available to administer recombinant Human Growth Hormone (rhGH), and both patients and health plans have varying issues to consider when selecting a particular product and device for daily use. Therefore, the present study utilized multi-dimensional product analysis to assess potential time involvement, required weekly administration steps, and utilization costs relative to daily rhGH administration. Study objectives were to conduct 1) Time-and-Motion (TM) simulations in a randomized block design that allowed time and steps comparisons related to rhGH preparation, administration and storage, and 2) a Cost Minimization Analysis (CMA) relative to opportunity and supply costs. Nurses naïve to rhGH administration and devices were recruited to evaluate four rhGH pen devices (2 in liquid form, 2 requiring reconstitution) via TM simulations. Five videotaped and timed trials for each product were evaluated based on: 1) Learning (initial use instructions), 2) Preparation (arrange device for use), 3) Administration (actual simulation manikin injection), and 4) Storage (maintain product viability between doses), in addition to assessment of steps required for weekly use. The CMA applied micro-costing techniques related to opportunity costs for caregivers (categorized as wages), non-drug medical supplies, and drug product costs. Norditropin® NordiFlex and Norditropin® NordiPen (NNF and NNP, Novo Nordisk, Inc., Bagsværd, Denmark) took less weekly Total Time (p < 0.05) to use than either of the comparator products, Genotropin® Pen (GTP, Pfizer, Inc., New York, New York) or HumatroPen® (HTP, Eli Lilly and Company, Indianapolis, Indiana). Time savings were directly related to differences in new package Preparation times (NNF 1.35 minutes, NNP 2.48 minutes, GTP 4.11 minutes, HTP 8.64 minutes; p < 0.05). Administration and Storage times were not statistically different. NNF (15.8 minutes) and NNP (16.2 minutes) also took less time to Learn than HTP (24.0 minutes) and GTP (26.0 minutes) (p < 0.05). The number of weekly required administration steps was also least with NNF and NNP. Opportunity cost savings were greater in devices that were easier to prepare for use; GTP represented an 11.8% drug product savings over NNF, NNP and HTP at time of study. Overall supply costs represented <1% of drug costs for all devices. Time-and-motion simulation data used to support a micro-cost analysis demonstrated that the pen device with the greater time demand has highest net costs.
Mission Adaptive Uas Capabilities for Earth Science and Resource Assessment
NASA Astrophysics Data System (ADS)
Dunagan, S.; Fladeland, M.; Ippolito, C.; Knudson, M.; Young, Z.
2015-04-01
Unmanned aircraft systems (UAS) are important assets for accessing high-risk airspace and incorporate technologies for sensor coordination, onboard processing, tele-communication, unconventional flight control, and ground-based monitoring and optimization. These capabilities permit adaptive mission management in the face of complex requirements and chaotic external influences. NASA Ames Research Center has led a number of Earth science remote sensing missions directed at the assessment of natural resources, and here we describe two resource mapping problems having mission characteristics requiring a mission adaptive capability extensible to other resource assessment challenges. One example involves the requirement for careful control over solar angle geometry for passive reflectance measurements. This constraint exists when collecting imaging spectroscopy data over vegetation for time series analysis or for the coastal ocean, where solar angle combines with sea state to produce surface glint that can obscure the signal. Furthermore, the primary flight control imperative to minimize tracking error must be balanced against the requirement to minimize aircraft motion artifacts in the spatial measurement distribution. A second example involves mapping of natural resources in the Earth's crust using precision magnetometry. In this case the vehicle flight path must be oriented to optimize magnetic flux gradients over a spatial domain having continually emerging features, while optimizing the efficiency of the spatial mapping task. These requirements were highlighted in recent Earth Science missions including the OCEANIA mission, directed at improving the capability for spectral and radiometric reflectance measurements in the coastal ocean, and the Surprise Valley Mission, directed at mapping sub-surface mineral composition and faults using high-sensitivity magnetometry. This paper reports the development of specific aircraft control approaches to incorporate the unusual and demanding requirements to manage solar angle, aircraft attitude and flight path orientation, and efficient (directly geo-rectified) surface and sub-surface mapping, including the near-time optimization of these sometimes competing requirements.
Forcino, Frank L; Leighton, Lindsey R; Twerdy, Pamela; Cahill, James F
2015-01-01
Community ecologists commonly perform multivariate techniques (e.g., ordination, cluster analysis) to assess patterns and gradients of taxonomic variation. A critical requirement for a meaningful statistical analysis is accurate information on the taxa found within an ecological sample. However, oversampling (too many individuals counted per sample) also comes at a cost, particularly for ecological systems in which identification and quantification are substantially more resource consuming than the field expedition itself. In such systems, an increasingly larger sample size will eventually result in diminishing returns in improving any pattern or gradient revealed by the data, but will also lead to continually increasing costs. Here, we examine 396 datasets: 44 previously published and 352 created datasets. Using meta-analytic and simulation-based approaches, we seek (1) to determine the minimal sample sizes required to produce robust multivariate statistical results when conducting abundance-based community ecology research, and (2) to determine the dataset parameters (i.e., evenness, number of taxa, number of samples) that require larger sample sizes, regardless of resource availability. We found that in the 44 previously published and the 220 created datasets with randomly chosen abundances, a conservative estimate of a sample size of 58 produced the same multivariate results as all larger sample sizes. However, this minimal number varies as a function of evenness, where increased evenness resulted in increased minimal sample sizes. Sample sizes as small as 58 individuals are sufficient for a broad range of multivariate abundance-based research. In cases when resource availability is the limiting factor for conducting a project (e.g., a small university, or limited time to conduct the research), statistically viable results can still be obtained with less of an investment.
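The core simulation step can be illustrated by repeatedly subsampling individuals from each sample and checking when the multivariate distance structure stabilizes relative to the full dataset. In the sketch below, the Bray-Curtis metric, the Pearson agreement measure, and the synthetic community matrix are illustrative assumptions rather than the authors' exact protocol.

```python
import numpy as np
from scipy.spatial.distance import pdist
from scipy.stats import pearsonr

rng = np.random.default_rng(1)

def subsample_counts(counts, n):
    """Draw n individuals without replacement from one sample's taxon counts."""
    pool = np.repeat(np.arange(counts.size), counts)
    picked = rng.choice(pool, size=min(n, pool.size), replace=False)
    return np.bincount(picked, minlength=counts.size)

def structure_agreement(community, n):
    """Correlate Bray-Curtis distances of the subsampled matrix with the full matrix."""
    sub = np.array([subsample_counts(row, n) for row in community])
    return pearsonr(pdist(sub, "braycurtis"), pdist(community, "braycurtis"))[0]

# Hypothetical community matrix: 20 samples x 30 taxa of abundance counts.
community = rng.poisson(lam=rng.uniform(1, 20, size=30), size=(20, 30))

for n in (10, 25, 58, 100, 200):
    r = np.mean([structure_agreement(community, n) for _ in range(20)])
    print(f"sample size {n:>3}: mean agreement with full data r = {r:.3f}")
```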
Minimally invasive video-assisted thyroid surgery: how can we improve the learning curve?
Castagnola, G; Giulii Cappone, M; Tierno, S M; Mezzetti, G; Centanini, F; Vetrone, I; Bellotti, C
2012-10-01
Minimally invasive video-assisted thyroidectomy (MIVAT) is a technically demanding procedure and requires a surgical team skilled in both endocrine and endoscopic surgery. A time-consuming learning and training period is mandatory at the beginning of the experience. The aim of our report is to focus on some aspects of the learning curve of the surgeon who practices video-assisted thyroid procedures for the first time, through the analysis of our preliminary series of 36 cases. From September 2004 to April 2005 we selected 36 patients for minimally invasive video-assisted surgery of the thyroid. The patients were considered eligible if they presented with a nodule not exceeding 35 mm in maximum diameter; total thyroid volume within normal range; and absence of biochemical and echographic signs of thyroiditis. We analyzed the surgical results, conversion rate, operating time, post-operative complications, hospital stay, and cosmetic outcome of the series. We performed 36 total thyroidectomies. The procedure was successfully carried out in 33/36 cases. Post-operative complications included 3 transient recurrent nerve palsies and 2 transient hypocalcemias; no definitive hypoparathyroidism was registered. All patients were discharged 2 days after operation. The cosmetic result was considered excellent by most patients. Advances in skills and technology have enabled surgeons to reproduce most open surgical techniques with video-assistance or laparoscopically. Training is essential to acquire any new surgical technique and it should be organized in detail to exploit it completely.
Loehfelm, Thomas W; Prater, Adam B; Debebe, Tequam; Sekhar, Aarti K
2017-02-01
We digitized the radiography teaching file at Black Lion Hospital (Addis Ababa, Ethiopia) during a recent trip, using a standard digital camera and a fluorescent light box. Our goal was to photograph every radiograph in the existing library while optimizing the final image size to the maximum resolution of a high quality tablet computer, preserving the contrast resolution of the radiographs, and minimizing total library file size. A secondary important goal was to minimize the cost and time required to take and process the images. Three workers were able to efficiently remove the radiographs from their storage folders, hang them on the light box, operate the camera, catalog the image, and repack the radiographs back to the storage folder. Zoom, focal length, and film speed were fixed, while aperture and shutter speed were manually adjusted for each image, allowing for efficiency and flexibility in image acquisition. Keeping zoom and focal length fixed, which kept the view box at the same relative position in all of the images acquired during a single photography session, allowed unused space to be batch-cropped, saving considerable time in post-processing, at the expense of final image resolution. We present an analysis of the trade-offs in workflow efficiency and final image quality, and demonstrate that a few people with minimal equipment can efficiently digitize a teaching file library.
Adequacy of depression treatment among college students in the United States.
Eisenberg, Daniel; Chung, Henry
2012-01-01
There is no published evidence on the adequacy of depression care among college students and how this varies by subpopulations and provider types. We estimated the prevalence of minimally adequate treatment among students with significant past-year depressive symptoms. Data were collected via a confidential online survey of a random sample of 8488 students from 15 colleges and universities in the 2009 Healthy Minds Study. Depressive symptoms were assessed by the Patient Health Questionnaire-2, adapted to a past-year time frame. Students with probable depression were coded as having received minimally adequate depression care based on the criteria from Wang and colleagues (2005). Minimally adequate treatment was received by only 22% of depressed students. The likelihood of minimally adequate treatment was similarly low for both psychiatric medication and psychotherapy. Minimally adequate care was lower for students prescribed medication by a primary care provider as compared to a psychiatrist (P<.01). Racial/ethnic minority students were less likely to receive depression care (P<.01). Adequacy of depression care is a significant problem in the college population. Solutions will likely require greater availability of psychiatry care, better coordination between specialty and primary care using collaborative care models, and increased efforts to retain students in psychotherapy. Copyright © 2012 Elsevier Inc. All rights reserved.
5 CFR 581.203 - Information minimally required to accompany legal process.
Code of Federal Regulations, 2014 CFR
2014-01-01
... accompany legal process. 581.203 Section 581.203 Administrative Personnel OFFICE OF PERSONNEL MANAGEMENT... Process § 581.203 Information minimally required to accompany legal process. (a) Sufficient identifying information must accompany the legal process in order to enable processing by the governmental entity named...
5 CFR 581.203 - Information minimally required to accompany legal process.
Code of Federal Regulations, 2011 CFR
2011-01-01
... accompany legal process. 581.203 Section 581.203 Administrative Personnel OFFICE OF PERSONNEL MANAGEMENT... Process § 581.203 Information minimally required to accompany legal process. (a) Sufficient identifying information must accompany the legal process in order to enable processing by the governmental entity named...
5 CFR 581.203 - Information minimally required to accompany legal process.
Code of Federal Regulations, 2013 CFR
2013-01-01
... accompany legal process. 581.203 Section 581.203 Administrative Personnel OFFICE OF PERSONNEL MANAGEMENT... Process § 581.203 Information minimally required to accompany legal process. (a) Sufficient identifying information must accompany the legal process in order to enable processing by the governmental entity named...
5 CFR 581.203 - Information minimally required to accompany legal process.
Code of Federal Regulations, 2012 CFR
2012-01-01
... accompany legal process. 581.203 Section 581.203 Administrative Personnel OFFICE OF PERSONNEL MANAGEMENT... Process § 581.203 Information minimally required to accompany legal process. (a) Sufficient identifying information must accompany the legal process in order to enable processing by the governmental entity named...
5 CFR 581.203 - Information minimally required to accompany legal process.
Code of Federal Regulations, 2010 CFR
2010-01-01
... accompany legal process. 581.203 Section 581.203 Administrative Personnel OFFICE OF PERSONNEL MANAGEMENT... Process § 581.203 Information minimally required to accompany legal process. (a) Sufficient identifying information must accompany the legal process in order to enable processing by the governmental entity named...
A workstation-based evaluation of a far-field route planner for helicopters
NASA Technical Reports Server (NTRS)
Warner, David N., Jr.; Moran, Francis J.
1991-01-01
Helicopter flight missions at very low, nap-of-the-Earth altitudes place a heavy workload on the pilot. To aid in reducing this workload, Ames Research Center has been investigating various types of automated route planners. As part of an automated preflight mission planner, a route planner algorithm aids in selecting the overall (far-field) route to be flown. During the mission, the route planner can be used to replan a new route in case of unexpected threats or a change in mission requirements. An evaluation of a candidate route planning algorithm, based on dynamic programming techniques, is described. This algorithm meets most of the requirements for route planning, both preflight and during the mission. In general, the requirements are to minimize the distance and/or fuel and the deviation from a flight time schedule, and the route must be flyable within the constraints of available fuel and time.
Extracellular space preservation aids the connectomic analysis of neural circuits.
Pallotto, Marta; Watkins, Paul V; Fubara, Boma; Singer, Joshua H; Briggman, Kevin L
2015-12-09
Dense connectomic mapping of neuronal circuits is limited by the time and effort required to analyze 3D electron microscopy (EM) datasets. Algorithms designed to automate image segmentation suffer from substantial error rates and require significant manual error correction. Any improvement in segmentation error rates would therefore directly reduce the time required to analyze 3D EM data. We explored preserving extracellular space (ECS) during chemical tissue fixation to improve the ability to segment neurites and to identify synaptic contacts. ECS preserved tissue is easier to segment using machine learning algorithms, leading to significantly reduced error rates. In addition, we observed that electrical synapses are readily identified in ECS preserved tissue. Finally, we determined that antibodies penetrate deep into ECS preserved tissue with only minimal permeabilization, thereby enabling correlated light microscopy (LM) and EM studies. We conclude that preservation of ECS benefits multiple aspects of the connectomic analysis of neural circuits.
Study of V/STOL aircraft implementation. Volume 1: Summary
NASA Technical Reports Server (NTRS)
Portenier, W. J.; Webb, H. M.
1973-01-01
A high density short haul air market which by 1980 is large enough to support the introduction of an independent short haul air transportation system is discussed. This system will complement the existing air transportation system and will provide relief of noise and congestion problems at conventional airports. The study has found that new aircraft, exploiting V/STOL and quiet engine technology, can be available for implementing these new services, and they can operate from existing reliever and general aviation airports. The study has also found that the major funding requirements for implementing new short haul services could be borne by private capital, and that the government funding requirement would be minimal and/or recovered through the airline ticket tax. In addition, a suitable new short haul aircraft would have a market potential for $3.5 billion in foreign sales. The long lead times needed for aircraft and engine technology development will require timely actions by federal agencies.
Júnez-Ferreira, H E; Herrera, G S
2013-04-01
This paper presents a new methodology for the optimal design of space-time hydraulic head monitoring networks and its application to the Valle de Querétaro aquifer in Mexico. The selection of the space-time monitoring points is done using a static Kalman filter combined with a sequential optimization method. The Kalman filter requires as input a space-time covariance matrix, which is derived from a geostatistical analysis. A sequential optimization method is used that, at each step, selects the space-time point minimizing a function of the variance. We demonstrate the methodology by applying it to the redesign of the hydraulic head monitoring network of the Valle de Querétaro aquifer with the objective of selecting, from a set of monitoring positions and times, those that minimize the spatiotemporal redundancy. The database for the geostatistical space-time analysis corresponds to information from 273 wells located within the aquifer for the period 1970-2007. A total of 1,435 hydraulic head data were used to construct the experimental space-time variogram. The results show that of the existing monitoring program, which consists of 418 space-time monitoring points, only 178 are not redundant. The implied reduction of monitoring costs was possible because the proposed method is successful in propagating information in space and time.
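The sequential selection step can be sketched as a greedy loop over a static Kalman (kriging-type) covariance update: at each step, the candidate space-time point that most reduces the total estimation variance is added to the network. The exponential covariance model, the measurement-error variance, and the candidate set below are placeholders, not the Valle de Querétaro geostatistical model.

```python
import numpy as np

def greedy_monitoring_design(P, n_select, meas_var=0.1):
    """Sequentially pick space-time points that minimize total estimation variance.

    P        : prior space-time covariance matrix of the candidate points
    n_select : number of monitoring points (location/time pairs) to retain
    meas_var : measurement-error variance added to each observation
    """
    P = P.copy()
    chosen = []
    for _ in range(n_select):
        # Trace reduction from observing point i (static Kalman / kriging update).
        gains = np.array([(P[:, i] @ P[:, i]) / (P[i, i] + meas_var) for i in range(P.shape[0])])
        gains[chosen] = -np.inf                                      # do not reuse a point
        i = int(np.argmax(gains))
        chosen.append(i)
        P -= np.outer(P[:, i], P[i, :]) / (P[i, i] + meas_var)       # posterior covariance
    return chosen, np.trace(P)

# Placeholder covariance: exponential decay in a combined space-time "distance".
coords = np.random.default_rng(2).uniform(0, 10, size=(60, 3))       # (x, y, t) of candidates
d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
P_prior = np.exp(-d / 3.0)

points, remaining_var = greedy_monitoring_design(P_prior, n_select=15)
print(points, round(remaining_var, 2))
```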
A Fault-Tolerant Radiation-Robust Mass Storage Concept for Highly Scaled Flash Memory
NASA Astrophysics Data System (ADS)
Fuchs, Cristian M.; Trinitis, Carsten; Appel, Nicolas; Langer, Martin
2015-09-01
Future space missions will require vast amounts of data to be stored and processed aboard spacecraft. While satisfying operational mission requirements, storage systems must guarantee data integrity and recover damaged data throughout the mission. NAND-flash memories have become popular for space-borne high-performance mass memory scenarios, though future storage concepts will rely upon highly scaled flash or other memory technologies. With modern flash memory, single-bit erasure coding and RAID-based concepts are insufficient. Thus, a fully run-time configurable, high-performance, dependable storage concept is required, one needing only a minimal set of logic or software. The solution is based on composite erasure coding and can be adjusted for altered mission duration or changing environmental conditions.
NASA Astrophysics Data System (ADS)
Ying, Kai; Kowalski, John M.; Nogami, Toshizo; Yin, Zhanping; Sheng, Jia
2018-01-01
5G systems are expected to support the coexistence of multiple services such as ultra-reliable low-latency communications (URLLC) and enhanced mobile broadband (eMBB) communications. The target of eMBB communications is to meet the high-throughput requirement, while URLLC is used for some high-priority services. Due to its sporadic nature and low-latency requirement, URLLC transmission may pre-empt the resources of eMBB transmission. Our work analyzes the impact of URLLC on eMBB transmission in the mobile front-haul. Then, some solutions are proposed to guarantee the reliability/latency requirements for URLLC services and, at the same time, minimize the impact on eMBB services.
Inlet-engine matching for SCAR including application of a bicone variable geometry inlet
NASA Technical Reports Server (NTRS)
Wasserbauer, J. F.; Gerstenmaier, W. H.
1978-01-01
Variable cycle engines (VCE) designed for Mach 2.32 can have transonic airflow requirements as high as 1.6 times the cruise airflow. This is a formidable requirement for conventional, high-performance, axisymmetric, translating-centerbody mixed compression inlets. An alternate inlet is defined, in which the second cone of a two-cone centerbody collapses to the initial cone angle to provide a large off-design airflow capability, and which incorporates modest centerbody translation to minimize spillage drag. Estimates of transonic spillage drag are competitive with those of conventional translating-centerbody inlets. The inlet's cruise performance exhibits very low bleed requirements with good recovery and high angle-of-attack capability.
Minimal algorithm for running an internal combustion engine
NASA Astrophysics Data System (ADS)
Stoica, V.; Borborean, A.; Ciocan, A.; Manciu, C.
2018-01-01
The control of internal combustion engines is a well-known topic within the automotive industry and is widely used. However, in research laboratories and universities, a commercially available control system is often not the best solution because of its predetermined operating algorithms and calibrations (accessible only to the manufacturer), which allow little intervention from outside. Laboratory solutions on the market are very expensive. Consequently, in this paper we present a minimal algorithm required to start up and run an internal combustion engine. The presented solution can be adapted to run on high-performance microcontrollers available on the market at the present time and at an affordable price. The presented algorithm was implemented in LabView and runs on a CompactRIO hardware platform.
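As a rough illustration of what such a minimal algorithm involves, the sketch below reads engine speed and manifold pressure, looks up an injector pulse width and spark advance from small calibration tables, and repeats each cycle. It is a hypothetical Python mock-up with placeholder table values, not the authors' LabView/CompactRIO implementation.

```python
import time

# Placeholder calibration tables (rpm, load) -> injector pulse width [ms] and spark advance [deg BTDC].
FUEL_TABLE = {(1000, 0.3): 2.0, (1000, 0.6): 3.2, (3000, 0.3): 2.4, (3000, 0.6): 4.0}
SPARK_TABLE = {(1000, 0.3): 12, (1000, 0.6): 10, (3000, 0.3): 28, (3000, 0.6): 22}

def nearest(table, rpm, load):
    """Nearest-neighbour lookup; a real controller would interpolate between breakpoints."""
    key = min(table, key=lambda k: (k[0] - rpm) ** 2 + (k[1] - load) ** 2)
    return table[key]

def control_step(crank_rpm, manifold_pressure_kpa):
    load = manifold_pressure_kpa / 100.0            # crude speed-density load estimate
    pulse_ms = nearest(FUEL_TABLE, crank_rpm, load)
    spark_deg = nearest(SPARK_TABLE, crank_rpm, load)
    return pulse_ms, spark_deg

# Minimal run loop: read sensors, compute actuator commands, repeat each engine cycle.
if __name__ == "__main__":
    for rpm, map_kpa in [(950, 35), (1800, 55), (3200, 70)]:   # simulated sensor readings
        fuel, spark = control_step(rpm, map_kpa)
        print(f"{rpm} rpm, {map_kpa} kPa -> inject {fuel} ms, spark {spark} deg BTDC")
        time.sleep(0.01)
```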
McQueen, David A; Cooke, Francis W; Hahn, Dustan L
2005-01-01
The irretrievably failed total knee arthroplasty is the primary indication for knee arthrodesis. Because this difficult condition is relatively rare, an intramedullary arthrodesis system was developed which requires minimal surgeon experience for successful use. The new system called the Wichita Fusion Nail was implanted by a single surgeon in 13 consecutive patients: 11 for arthrodesis alone, 1 for stabilization of a supracondylar fracture nonunion, and 1 for arthrodesis coupled with a supracondylar fracture nonunion. All arthrodesis attempts were successful. The average fusion time was 15.2 weeks except for 2 infected delayed arthrodeses. Both fracture nonunions persisted and went on to amputation. The WFN provides a simple arthrodesis system with minimal technique dependence and a high potential for success.
Efficiency of unconstrained minimization techniques in nonlinear analysis
NASA Technical Reports Server (NTRS)
Kamat, M. P.; Knight, N. F., Jr.
1978-01-01
Unconstrained minimization algorithms have been critically evaluated for their effectiveness in solving structural problems involving geometric and material nonlinearities. The algorithms have been categorized as being zeroth, first, or second order depending upon the highest derivative of the function required by the algorithm. The sensitivity of these algorithms to the accuracy of derivatives clearly suggests using analytically derived gradients instead of finite difference approximations. The use of analytic gradients results in better control of the number of minimizations required for convergence to the exact solution.
Minimizing the area required for time constants in integrated circuits
NASA Technical Reports Server (NTRS)
Lyons, J. C.
1972-01-01
When a medium- or large-scale integrated circuit is designed, efforts are usually made to avoid the use of resistor-capacitor time constant generators. The capacitor needed for this circuit usually takes up more surface area on the chip than several resistors and transistors. When the use of this network is unavoidable, the designer usually makes an effort to see that the choice of resistor and capacitor combinations is such that a minimum amount of surface area is consumed. The optimum ratio of resistance to capacitance that will result in this minimum area is equal to the ratio of resistance to capacitance which may be obtained from a unit of surface area for the particular process being used. The minimum area required is a function of the square root of the reciprocal of the product of the resistance and capacitance per unit area. This minimum occurs when the area required by the resistor is equal to the area required by the capacitor.
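The stated optimum follows from a short constrained minimization. Writing r and c for the resistance and capacitance obtainable per unit of chip area (symbols introduced here for illustration), the total area for a fixed time constant τ = RC is minimized as follows:

```latex
\[
A = A_R + A_C = \frac{R}{r} + \frac{C}{c}, \qquad \text{subject to } RC = \tau .
\]
Eliminating $C = \tau/R$ and setting $\dfrac{dA}{dR} = \dfrac{1}{r} - \dfrac{\tau}{c R^{2}} = 0$ gives
\[
R = \sqrt{\frac{r\tau}{c}}, \qquad C = \sqrt{\frac{c\tau}{r}}, \qquad
\frac{R}{C} = \frac{r}{c}, \qquad
A_R = A_C = \sqrt{\frac{\tau}{rc}}, \qquad
A_{\min} = 2\sqrt{\frac{\tau}{rc}} .
\]
```

The optimum R/C thus equals r/c, the resistor and capacitor occupy equal areas, and the minimum total area scales as the square root of 1/(rc), consistent with the statements above.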
Thermal Design to Meet Stringent Temperature Gradient/Stability Requirements of SWIFT BAT Detectors
NASA Technical Reports Server (NTRS)
Choi, Michael K.
2000-01-01
The Burst Alert Telescope (BAT) is an instrument on the National Aeronautics and Space Administration (NASA) SWIFT spacecraft. It is designed to detect gamma ray burst over a broad region of the sky and quickly align the telescopes on the spacecraft to the gamma ray source. The thermal requirements for the BAT detector arrays are very stringent. The maximum allowable temperature gradient of the 256 cadmium zinc telluride (CZT) detectors is PC. Also, the maximum allowable rate of temperature change of the ASICs of the 256 Detector Modules (DMs) is PC on any time scale. The total power dissipation of the DMs and Block Command & Data Handling (BCDH) is 180 W. This paper presents a thermal design that uses constant conductance heat pipes (CCHPs) to minimize the temperature gradient of the DMs, and loop heat pipes (LHPs) to transport the waste heat to the radiator. The LHPs vary the effective thermal conductance from the DMs to the radiator to minimize heater power to meet the heater power budget, and to improve the temperature stability. The DMs are cold biased, and active heater control is used to meet the temperature gradient and stability requirements.
An Efficient Interactive Model for On-Demand Sensing-As-A-Services of Sensor-Cloud.
Dinh, Thanh; Kim, Younghan
2016-06-28
This paper proposes an efficient interactive model for the sensor-cloud to enable the sensor-cloud to efficiently provide on-demand sensing services for multiple applications with different requirements at the same time. The interactive model is designed for both the cloud and sensor nodes to optimize the resource consumption of physical sensors, as well as the bandwidth consumption of sensing traffic. In the model, the sensor-cloud plays a key role in aggregating application requests to minimize the workloads required for constrained physical nodes while guaranteeing that the requirements of all applications are satisfied. Physical sensor nodes perform their sensing under the guidance of the sensor-cloud. Based on the interactions with the sensor-cloud, physical sensor nodes adapt their scheduling accordingly to minimize their energy consumption. Comprehensive experimental results show that our proposed system achieves a significant improvement in terms of the energy consumption of physical sensors, the bandwidth consumption from the sink node to the sensor-cloud, the packet delivery latency, reliability and scalability, compared to current approaches. Based on the obtained results, we discuss the economical benefits and how the proposed system enables a win-win model in the sensor-cloud.
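One way to picture the request-aggregation role of the sensor-cloud is sketched below: the cloud merges the sampling periods requested by several applications so that each physical sensor runs a single schedule and every application is served from the shared stream. The period-merging rule (greatest common divisor) and the example requests are illustrative assumptions, not the scheduling algorithm specified in the paper.

```python
from math import gcd
from functools import reduce

def aggregate_requests(requests):
    """Merge per-application sampling periods (seconds) into one schedule per sensor.

    requests: {app_id: {sensor_id: period_s}}
    returns : {sensor_id: merged_period_s}; every requested period is a multiple of the
              merged one, so each application can be served by downsampling the shared stream.
    """
    per_sensor = {}
    for app, sensors in requests.items():
        for sensor, period in sensors.items():
            per_sensor.setdefault(sensor, []).append(period)
    return {s: reduce(gcd, periods) for s, periods in per_sensor.items()}

# Three hypothetical applications sharing two physical temperature sensors.
requests = {
    "hvac":    {"s1": 60, "s2": 30},
    "alerts":  {"s1": 10},
    "logging": {"s1": 300, "s2": 120},
}
print(aggregate_requests(requests))   # {'s1': 10, 's2': 30}
```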
Electrically tunable soft solid lens inspired by reptile and bird accommodation.
Pieroni, Michael; Lagomarsini, Clara; De Rossi, Danilo; Carpi, Federico
2016-10-26
Electrically tunable lenses are conceived as deformable adaptive optical components able to change focus without motor-controlled translations of stiff lenses. In order to achieve large tuning ranges, large deformations are needed. This requires new technologies for the actuation of highly stretchable lenses. This paper presents a configuration to obtain compact tunable lenses entirely made of soft solid matter (elastomers). This was achieved by combining the advantages of dielectric elastomer actuation (DEA) with a design inspired by the accommodation of reptiles and birds. An annular DEA was used to radially deform a central solid-body lens. Using an acrylic elastomer membrane, a silicone lens and a simple fabrication method, we assembled a tunable lens capable of focal length variations up to 55%, driven by an actuator four times larger than the lens. As compared to DEA-based liquid lenses, the novel architecture halves the required driving voltages, simplifies the fabrication process and allows for a higher versatility in design. These new lenses might find application in systems requiring large variations of focus with low power consumption, silent operation, low weight, shock tolerance, minimized axial encumbrance and minimized changes of performance against vibrations and variations in temperature.
A fuzzy goal programming model for biodiesel production
NASA Astrophysics Data System (ADS)
Lutero, D. S.; Pangue, EMU; Tubay, J. M.; Lubag, S. P.
2016-02-01
A fuzzy goal programming (FGP) model for biodiesel production in the Philippines was formulated with Coconut (Cocos nucifera) and Jatropha (Jatropha curcas) as sources of biodiesel. The objectives were maximization of feedstock production and overall revenue, and minimization of the energy used in production and the working capital for farming, subject to biodiesel and non-biodiesel requirements and the availability of land, labor, water, and machine time. All of these objectives and constraints were assumed to be fuzzy. The model was tested for different sets of weights. Results for all sets of weights showed the same optimal allocation. Coconut alone can satisfy the biodiesel requirement of 2% per volume.
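A generic weighted fuzzy-goal-programming formulation of the kind described is sketched below, with linear membership functions μ_k for each fuzzy goal and weights w_k; the specific membership bounds and the full coconut/jatropha constraint set are not reproduced here.

```latex
\[
\max \; \sum_{k} w_{k}\,\lambda_{k}
\quad \text{s.t.} \quad
\lambda_{k} \le \mu_{k}\bigl(Z_{k}(x)\bigr), \qquad 0 \le \lambda_{k} \le 1, \qquad x \in X,
\]
\[
\mu_{k}(Z_{k}) =
\begin{cases}
1, & Z_{k} \ge Z_{k}^{\ast} \\[2pt]
\dfrac{Z_{k} - Z_{k}^{-}}{Z_{k}^{\ast} - Z_{k}^{-}}, & Z_{k}^{-} < Z_{k} < Z_{k}^{\ast} \\[2pt]
0, & Z_{k} \le Z_{k}^{-}
\end{cases}
\qquad \text{(maximization goals; minimization goals mirrored)},
\]
where $Z_k(x)$ are the goal functions (feedstock production, revenue, energy use, working capital), $Z_k^{\ast}$ and $Z_k^{-}$ their aspiration and worst acceptable levels, and $X$ the crisp resource constraints (land, labor, water, machine time, biodiesel requirement).
```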
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brown, Nicholas R.; Pointer, William David; Sieger, Matt
2016-04-01
The goal of this review is to enable application of codes or software packages for safety assessment of advanced sodium-cooled fast reactor (SFR) designs. To address near-term programmatic needs, the authors have focused on two objectives. First, the authors have focused on identification of requirements for software QA that must be satisfied to enable the application of software to future safety analyses. Second, the authors have collected best practices applied by other code development teams to minimize cost and time of initial code qualification activities and to recommend a path to the stated goal.
A Simplified Shuttle Payload Thermal Analyzer /SSPTA/ program
NASA Technical Reports Server (NTRS)
Bartoszek, J. T.; Huckins, B.; Coyle, M.
1979-01-01
A simple thermal analysis program for Space Shuttle payloads has been developed to accommodate the user who requires an easily understood but dependable analytical tool. The thermal analysis program includes several thermal subprograms traditionally employed in spacecraft thermal studies, a data management system for data generated by the subprograms, and a master program to coordinate the data files and thermal subprograms. The language and logic used to run the thermal analysis program are designed for the small user. In addition, analytical and storage techniques which conserve computer time and minimize core requirements are incorporated into the program.
Real-time implementation of second generation of audio multilevel information coding
NASA Astrophysics Data System (ADS)
Ali, Murtaza; Tewfik, Ahmed H.; Viswanathan, V.
1994-03-01
This paper describes a real-time implementation of a novel wavelet-based audio compression method. This method is based on the discrete wavelet transform (DWT) representation of signals. A bit allocation procedure is used to allocate bits to the transform coefficients in an adaptive fashion. The bit allocation procedure has been designed to take advantage of the masking effect in human hearing. The procedure minimizes the number of bits required to represent each frame of audio signals at a fixed distortion level. The real-time implementation provides almost transparent compression of monophonic CD-quality audio signals (sampled at 44.1 kHz and quantized using 16 bits/sample) at bit rates of 64-78 kbits/sec. Our implementation uses two ASPI Elf boards, each of which is built around a TI TMS320C31 DSP chip. The time required for encoding of a mono CD signal is about 92 percent of real time and that for decoding about 61 percent.
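A toy version of the adaptive bit-allocation idea is sketched below: an audio frame is decomposed with a DWT and bits are handed out greedily to whichever subband currently contributes the most quantization error. The wavelet choice, the uniform quantizer, and the bit budget are illustrative assumptions; the paper's psychoacoustic masking model is not reproduced.

```python
import numpy as np
import pywt

def quantize(band, bits):
    """Uniform quantizer with 'bits' bits; returns the reconstructed band."""
    if bits == 0:
        return np.zeros_like(band)
    step = (np.max(np.abs(band)) or 1.0) * 2.0 / (2 ** bits)
    return np.round(band / step) * step

def allocate_bits(frame, wavelet="db4", level=5, total_bits=4000):
    """Greedy allocation: repeatedly give one more bit per coefficient to the subband
    whose quantization error is currently largest, until the bit budget is spent."""
    bands = pywt.wavedec(frame, wavelet, level=level)
    bits = [0] * len(bands)
    budget = total_bits
    while budget > 0:
        errors = [np.sum((b - quantize(b, nb)) ** 2) for b, nb in zip(bands, bits)]
        k = int(np.argmax(errors))
        if bands[k].size > budget:
            break
        bits[k] += 1
        budget -= bands[k].size          # one extra bit for every coefficient in that band
    return bits

frame = np.random.default_rng(3).standard_normal(1024)   # stand-in for one audio frame
print(allocate_bits(frame))
```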
A Nonlinear Least Squares Approach to Time of Death Estimation Via Body Cooling.
Rodrigo, Marianito R
2016-01-01
The problem of time of death (TOD) estimation by body cooling is revisited by proposing a nonlinear least squares approach that takes as input a series of temperature readings only. Using a reformulation of the Marshall-Hoare double exponential formula and a technique for reducing the dimension of the state space, an error function that depends on the two cooling rates is constructed, with the aim of minimizing this function. Standard nonlinear optimization methods that are used to minimize the bivariate error function require an initial guess for these unknown rates. Hence, a systematic procedure based on the given temperature data is also proposed to determine an initial estimate for the rates. Then, an explicit formula for the TOD is given. Results of numerical simulations using both theoretical and experimental data are presented, both yielding reasonable estimates. The proposed procedure does not require knowledge of the temperature at death nor the body mass. In fact, the method allows the estimation of the temperature at death once the cooling rates and the TOD have been calculated. The procedure requires at least three temperature readings, although more measured readings could improve the estimates. With the aid of computerized recording and thermocouple detectors, temperature readings spaced 10-15 min apart, for example, can be taken. The formulas can be straightforwardly programmed and installed on a hand-held device for field use. © 2015 American Academy of Forensic Sciences.
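The fitting step can be illustrated with a small nonlinear least-squares sketch over a double-exponential cooling curve. Unlike the method above, which avoids assuming the temperature at death, this sketch fixes the ambient and at-death temperatures, and the temperature readings are made up, so it only shows the shape of the optimization, not the paper's reformulation.

```python
import numpy as np
from scipy.optimize import least_squares

T_AMBIENT, T_DEATH = 21.0, 37.0      # assumed known here (the paper avoids this assumption)

def double_exponential(t, z, p):
    """Normalized double-exponential cooling curve at time t since death (assumes p != z)."""
    return (p * np.exp(-z * t) - z * np.exp(-p * t)) / (p - z)

def residuals(params, t_since_first, temps):
    z, p, tau = params               # two cooling rates and time from death to first reading
    model = T_AMBIENT + (T_DEATH - T_AMBIENT) * double_exponential(t_since_first + tau, z, p)
    return model - temps

# Example readings: hours since the first measurement and body temperatures (made-up data).
t_obs = np.array([0.0, 0.25, 0.5, 0.75, 1.0])
T_obs = np.array([31.2, 30.8, 30.4, 30.1, 29.7])

fit = least_squares(residuals, x0=[0.1, 0.5, 3.0], args=(t_obs, T_obs),
                    bounds=([1e-3, 1e-3, 0.0], [2.0, 5.0, 24.0]))
z, p, tau = fit.x
print(f"estimated time since death at first reading: {tau:.2f} h  (rates z={z:.3f}, p={p:.3f})")
```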
The role of elastic energy in activities with high force and power requirements: a brief review.
Wilson, Jacob M; Flanagan, Eamonn P
2008-09-01
The purpose of this article is to provide strength and conditioning practitioners with an understanding of the role of elastic energy in activities with high force and power requirements. Specifically, the article covers 1) the nature of elasticity and its application to human participants, 2) the role of elastic energy in activities requiring a stretch-shorten cycle such as the vertical jump, 3) the role of muscular stiffness in athletic performance, 4) the control of muscular stiffness through feedforward and feedback mechanisms, and 5) factors affecting muscular stiffness. Finally, practical applications are provided. In this section, it is suggested that the storage and reuse of elastic energy is optimized at relatively higher levels of stiffness. Because stiffness decreases as fatigue ensues as well as with stretching before an event, the article emphasizes the need for proper preparation phases in a periodized cycle and the avoidance of long static stretches before high-force activities. The importance of teaching athletes to transition from eccentric to concentric movements with minimal time delays is also proposed due to the finding that time delays appear to decrease the reuse of elastic energy. In addition to teaching within the criterion tasks, evidence is provided that minimizing transitions in plyometric training, a technique demonstrated to increase musculotendinous stiffness, can optimize power output in explosive movements. Finally, evidence is provided that training and teaching programs designed to optimize muscular stiffness may protect athletes against sports-related injuries.
Bolton, William David; Cochran, Thomas; Ben-Or, Sharon; Stephenson, James E; Ellis, William; Hale, Allyson L; Binks, Andrew P
The aims of the study were to evaluate electromagnetic navigational bronchoscopy (ENB) and computed tomography-guided placement as localization techniques for minimally invasive resection of small pulmonary nodules and determine whether electromagnetic navigational bronchoscopy is a safer and more effective method than computed tomography-guided localization. We performed a retrospective review of our thoracic surgery database to identify patients who underwent minimally invasive resection for a pulmonary mass and used either electromagnetic navigational bronchoscopy or computed tomography-guided localization techniques between July 2011 and May 2015. Three hundred eighty-three patients had a minimally invasive resection during our study period, 117 of whom underwent electromagnetic navigational bronchoscopy or computed tomography localization (electromagnetic navigational bronchoscopy = 81; computed tomography = 36). There was no significant difference between computed tomography and electromagnetic navigational bronchoscopy patient groups with regard to age, sex, race, pathology, nodule size, or location. Both computed tomography and electromagnetic navigational bronchoscopy were 100% successful at localizing the mass, and there was no difference in the type of definitive surgical resection (wedge, segmentectomy, or lobectomy) (P = 0.320). Postoperative complications occurred in 36% of all patients, but there were no complications related to the localization procedures. In terms of localization time and surgical time, there was no difference between groups. However, the down/wait time between localization and resection was significant (computed tomography = 189 minutes; electromagnetic navigational bronchoscopy = 27 minutes); this explains why the difference in total time (sum of localization, down, and surgery) was significant (P < 0.001). We found electromagnetic navigational bronchoscopy to be as safe and effective as computed tomography-guided wire placement and to provide a significantly decreased down time between localization and surgical resection.
van Empel, Pieter J; Verdam, Mathilde G E; Strypet, Magnus; van Rijssen, Lennart B; Huirne, Judith A; Scheele, Fedde; Bonjer, H Jaap; Meijerink, W Jeroen
2012-01-01
Knot tying and suturing skills in minimally invasive surgery (MIS) differ markedly from those in open surgery. Appropriate MIS training is mandatory before implementation into practice. The Advanced Suturing Course (ASC) is a structured simulator based training course that includes a 6-week autonomous training period at home on a traditional laparoscopic box trainer. Previous research did not demonstrate a significant progress in laparoscopic skills after this training period. This study aims to identify factors determining autonomous training on a laparoscopic box trainer at home. Residents (n = 97) attending 1 of 7 ASC courses between January 2009 and June 2011 were consecutively included. After 6 weeks of autonomous, training a questionnaire was completed. A random subgroup of 30 residents was requested to keep a time log. All residents received an online survey after attending the ASC. We performed outcome comparison to examine the accuracy of individual responses. Out of 97 residents, the main motives for noncompliant autonomous training included a lack of (training) time after working hours (n = 80, 83.3%), preferred practice time during working hours (n = 76, 31.6%), or another surgical interest than MIS (n = 79, 15.2%). Previously set training goals would encourage autonomous training according to 27.8% (n = 18) of residents. Thirty participants submitted a time log and reported an average 76.5-minute weekly training time. All residents confirmed that autonomous home practice on a laparoscopic box trainer is valuable. Autonomous practice should be structured and inclusive of adequate and sufficient feedback points. A minimally required practice time should be set. An obligatory assessment, including corresponding consequence should be conducted. Compliance herewith may result in increased voluntary (autonomous) simulator based (laparoscopic) training by residents. Copyright © 2012 Association of Program Directors in Surgery. Published by Elsevier Inc. All rights reserved.
36 CFR 228.8 - Requirements for environmental protection.
Code of Federal Regulations, 2012 CFR
2012-07-01
... shall be conducted so as, where feasible, to minimize adverse environmental impacts on National Forest..., shall either be removed from National Forest lands or disposed of or treated so as to minimize, so far... 36 Parks, Forests, and Public Property 2 2012-07-01 2012-07-01 false Requirements for...
40 CFR 63.6605 - What are my general requirements for complying with this subpart?
Code of Federal Regulations, 2010 CFR
2010-07-01
... maintain any affected source, including associated air pollution control equipment and monitoring equipment, in a manner consistent with safety and good air pollution control practices for minimizing emissions. The general duty to minimize emissions does not require you to make any further efforts to reduce...
Distributed simulation using a real-time shared memory network
NASA Technical Reports Server (NTRS)
Simon, Donald L.; Mattern, Duane L.; Wong, Edmond; Musgrave, Jeffrey L.
1993-01-01
The Advanced Control Technology Branch of the NASA Lewis Research Center performs research in the area of advanced digital controls for aeronautic and space propulsion systems. This work requires the real-time implementation of both control software and complex dynamical models of the propulsion system. We are implementing these systems in a distributed, multi-vendor computer environment. Therefore, a need exists for real-time communication and synchronization between the distributed multi-vendor computers. A shared memory network is a potential solution which offers several advantages over other real-time communication approaches. A candidate shared memory network was tested for basic performance. The shared memory network was then used to implement a distributed simulation of a ramjet engine. The accuracy and execution time of the distributed simulation were measured and compared to the performance of the non-partitioned simulation. The ease of partitioning the simulation, the minimal time required to develop the communication between the processors, and the resulting execution time all indicate that the shared memory network is a real-time communication technique worthy of serious consideration.
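As a rough, hypothetical illustration of the underlying idea (not the NASA shared memory network itself), the following Python sketch shares a small simulation state vector between two processes through a common memory block; the state layout, the trivial control law, and the absence of explicit real-time synchronization are all simplifying assumptions.

import numpy as np
from multiprocessing import Process
from multiprocessing.shared_memory import SharedMemory

N_STATES = 4                 # e.g. spool speed, pressures, temperature (illustrative)

def controller(shm_name):
    # Attach to the shared block and read the plant state on each "cycle".
    shm = SharedMemory(name=shm_name)
    state = np.ndarray((N_STATES,), dtype=np.float64, buffer=shm.buf)
    for _ in range(5):
        u = -0.1 * state[0]                       # trivial proportional "control law"
        print("controller sees", state.copy(), "-> command", round(u, 3))
    del state                                     # release the buffer view before closing
    shm.close()

if __name__ == "__main__":
    shm = SharedMemory(create=True, size=N_STATES * 8)
    state = np.ndarray((N_STATES,), dtype=np.float64, buffer=shm.buf)
    state[:] = [100.0, 1.0, 300.0, 0.0]           # initial plant state
    p = Process(target=controller, args=(shm.name,))
    p.start()
    for _ in range(5):
        state[0] *= 0.99                          # toy plant dynamics updated in place
    p.join()
    del state
    shm.close()
    shm.unlink()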
Biotechnology Research Requirements for Aeronautical Systems through the Year 2000. Volume 1
1982-07-30
[Garbled excerpt from the report; the recoverable fragments concern reducing an aero vehicle's detectability by reducing its optical, electro-optic, acoustic, and infrared signatures, containing radioactive fallout, and chemical/biological agent defense: basic biochemical and pharmacological data on reported agents such as mycotoxins are minimal, and detection research (mass spectroscopy, quartz microbalances, industrial hygiene dosimetry) should quantify human agent doses.]
Evaluation of Tissue Interactions with Mechanical Elements of a Transscleral Drug Delivery Device
Cohen, Sarah J.; Chan, Robison V. Paul; Keegan, Mark; Andreoli, Christopher M.; Borenstein, Jeffrey T.; Miller, Joan W.; Gragoudas, Evangelos S.
2012-01-01
The goal of this work was to evaluate tissue-device interactions due to implantation of a mechanically operated drug delivery system onto the posterior sclera. Two test devices were designed and fabricated to model elements of the drug delivery device—one containing a free-spinning ball bearing and the other encasing two articulating gears. Openings in the base of test devices modeled ports for drug passage from device to sclera. Porous poly(tetrafluoroethylene) (PTFE) membranes were attached to half of the gear devices to minimize tissue ingrowth through these ports. Test devices were sutured onto rabbit eyes for 10 weeks. Tissue-device interactions were evaluated histologically and mechanically after removal to determine effects on device function and changes in surrounding tissue. Test devices were generally well-tolerated during residence in the animal. All devices encouraged fibrous tissue formation between the sclera and the device, fibrous tissue encapsulation and invasion around the device, and inflammation of the conjunctiva. Gear devices encouraged significantly greater inflammation in all cases and a larger rate of tissue ingrowth. PTFE membranes prevented tissue invasion through the covered drug ports, though tissue migrated in through other smaller openings. The torque required to turn the mechanical elements increased over 1000 times for gear devices, but only on the order of 100 times for membrane-covered gear devices and less than 100 times for ball bearing devices. Maintaining a lower device profile, minimizing microscale motion on the eye surface and covering drug ports with a porous membrane may minimize inflammation, decreasing the risk of damage to surrounding tissues and minimizing disruption of device operation. PMID:24300189
Mundt, Christian; Sventitskiy, Alexander; Cehelsky, Jeffrey E.; Patters, Andrea B.; Tservistas, Markus; Hahn, Michael C.; Juhl, Gerd; DeVincenzo, John P.
2012-01-01
Background. New aerosol drugs for infants may require more efficient delivery systems, including face masks. Maximizing delivery efficiency requires tight-fitting masks with minimal internal mask volumes, which could cause carbon dioxide (CO2) retention. An RNA-interference-based antiviral for treatment of respiratory syncytial virus in populations that may include young children is designed for aerosol administration. CO2 accumulation within inhalation face masks has not been evaluated. Methods. We simulated airflow and CO2 concentrations accumulating over time within a new facemask designed for infants and young children (PARI SMARTMASK® Baby). A one-dimensional model was first examined, followed by 3-dimensional unsteady computational fluid dynamics analyses. Normal infant breathing patterns and respiratory distress were simulated. Results. The maximum average modeled CO2 concentration within the mask reached steady state (3.2% and 3% for normal and distressed breathing patterns resp.) after approximately the 5th respiratory cycle. After steady state, the mean CO2 concentration inspired into the nostril was 2.24% and 2.26% for normal and distressed breathing patterns, respectively. Conclusion. The mask is predicted to cause minimal CO2 retention and rebreathing. Infants with normal and distressed breathing should tolerate the mask intermittently delivering aerosols over brief time frames. PMID:22792479
Artificial Gravity for Mars Missions: The Different Design and Development Options
NASA Technical Reports Server (NTRS)
Murbach, Marcus; Arno, Roger D.
2000-01-01
One of the major impediments to human Mars missions is the development of appropriate countermeasures for long term physiological response to the micro-gravity environment. A plethora of countermeasure approaches have been advanced, from strictly pharmacological measures to large diameter rotating spacecraft that would simulate a 1-g environment (the latter being the most conservative from a human health perspective). The different approaches have significantly different implications not only on the overall system design of a Mars Mission Vehicle (MMV) but on the necessary earth-orbiting platform that would be required to qualify the particular countermeasure system. It is found that these different design options can be conveniently categorized in terms of the order of magnitude of the rotation diameter required (100's, 10's, 1's, 0 meters). From this, the different mass penalties associated with each category can be generally compared. The overall objective of the countermeasure system should be to maximize crew safety and comfort, minimize exercise protocol time (i.e., the time per day that each crew member would have to participate in the exercise/countermeasure), maximize countermeasure effectiveness, and minimize the associated system mass penalty of the Mars Mission Vehicle (in terms of fraction of IMLEO - Injected Mass in Low Earth Orbit).
Multiple video sequences synchronization during minimally invasive surgery
NASA Astrophysics Data System (ADS)
Belhaoua, Abdelkrim; Moreau, Johan; Krebs, Alexandre; Waechter, Julien; Radoux, Jean-Pierre; Marescaux, Jacques
2016-03-01
Hybrid operating rooms are an important development in the medical ecosystem. They allow integrating, in the same procedure, the advantages of radiological imaging and surgical tools. However, one of the challenges faced by clinical engineers is to support the connectivity and interoperability of medical-electrical point-of-care devices. A system that could enable plug-and-play connectivity and interoperability for medical devices would improve patient safety, save hospitals time and money, and provide data for electronic medical records. In this paper, we propose a hardware platform dedicated to collecting and synchronizing multiple videos captured from medical equipment in real time. The final objective is to integrate augmented reality technology into an operating room (OR) in order to assist the surgeon during a minimally invasive operation. To the best of our knowledge, there is no prior work dealing with hardware-based video synchronization for augmented reality applications in the OR. Whilst hardware synchronization methods can embed a temporal value, a so-called timestamp, into each sequence on-the-fly and require no post-processing, they require specialized hardware. However, the design of our hardware is simple and generic. This approach was adopted and implemented in this work, and its performance is evaluated by comparison to state-of-the-art methods.
Leaver, Chad Andrew; Guttmann, Astrid; Zwarenstein, Merrick; Rowe, Brian H; Anderson, Geoff; Stukel, Therese; Golden, Brian; Bell, Robert; Morra, Dante; Abrams, Howard; Schull, Michael J
2009-06-08
Rigorous evaluation of an intervention requires that its allocation be unbiased with respect to confounders; this is especially difficult in complex, system-wide healthcare interventions. We developed a short survey instrument to identify factors for a minimization algorithm for the allocation of a hospital-level intervention to reduce emergency department (ED) waiting times in Ontario, Canada. Potential confounders influencing the intervention's success were identified by literature review, and grouped by healthcare setting specific change stages. An international multi-disciplinary (clinical, administrative, decision maker, management) panel evaluated these factors in a two-stage modified-delphi and nominal group process based on four domains: change readiness, evidence base, face validity, and clarity of definition. An original set of 33 factors were identified from the literature. The panel reduced the list to 12 in the first round survey. In the second survey, experts scored each factor according to the four domains; summary scores and consensus discussion resulted in the final selection and measurement of four hospital-level factors to be used in the minimization algorithm: improved patient flow as a hospital's leadership priority; physicians' receptiveness to organizational change; efficiency of bed management; and physician incentives supporting the change goal. We developed a simple tool designed to gather data from senior hospital administrators on factors likely to affect the success of a hospital patient flow improvement intervention. A minimization algorithm will ensure balanced allocation of the intervention with respect to these factors in study hospitals.
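The abstract does not specify the allocation algorithm beyond naming it minimization; the sketch below therefore shows a generic Pocock-Simon-style minimization over the four selected hospital-level factors, with the factor levels, imbalance metric, and biased-coin probability chosen purely for illustration rather than taken from the trial protocol.

import random

FACTORS = ["flow_priority", "physician_receptiveness",
           "bed_management_efficiency", "physician_incentives"]

def imbalance(counts):
    # Sum over factors and levels of the absolute arm-count difference.
    total = 0
    for factor in FACTORS:
        for level_counts in counts[factor].values():
            total += abs(level_counts["intervention"] - level_counts["control"])
    return total

def allocate(hospital, counts, arms=("intervention", "control"), p_best=0.8):
    # Score each arm by the imbalance that would result from assigning this hospital to it.
    scores = {}
    for arm in arms:
        trial = {f: {lvl: dict(c) for lvl, c in counts[f].items()} for f in FACTORS}
        for f in FACTORS:
            trial[f].setdefault(hospital[f], {"intervention": 0, "control": 0})
            trial[f][hospital[f]][arm] += 1
        scores[arm] = imbalance(trial)
    best = min(scores, key=scores.get)
    # Assign the imbalance-minimizing arm with high probability to retain some randomness.
    arm = best if random.random() < p_best else [a for a in arms if a != best][0]
    for f in FACTORS:
        counts[f].setdefault(hospital[f], {"intervention": 0, "control": 0})
        counts[f][hospital[f]][arm] += 1
    return arm

counts = {f: {} for f in FACTORS}
print(allocate({"flow_priority": "high", "physician_receptiveness": "medium",
                "bed_management_efficiency": "low", "physician_incentives": "yes"},
               counts))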
Responsible gambling: general principles and minimal requirements.
Blaszczynski, Alex; Collins, Peter; Fong, Davis; Ladouceur, Robert; Nower, Lia; Shaffer, Howard J; Tavares, Hermano; Venisse, Jean-Luc
2011-12-01
Many international jurisdictions have introduced responsible gambling programs. These programs intend to minimize negative consequences of excessive gambling, but vary considerably in their aims, focus, and content. Many responsible gambling programs lack a conceptual framework and, in the absence of empirical data, their components are based only on general considerations and impressions. This paper outlines the consensus viewpoint of an international group of researchers suggesting fundamental responsible gambling principles, roles of key stakeholders, and minimal requirements that stakeholders can use to frame and inform responsible gambling programs across jurisdictions. Such a framework does not purport to offer value statements regarding the legal status of gambling or its expansion. Rather, it proposes gambling-related initiatives aimed at government, industry, and individuals to promote responsible gambling and consumer protection. This paper argues that there is a set of basic principles and minimal requirements that should form the basis for every responsible gambling program.
PLA realizations for VLSI state machines
NASA Technical Reports Server (NTRS)
Gopalakrishnan, S.; Whitaker, S.; Maki, G.; Liu, K.
1990-01-01
A major problem associated with state assignment procedures for VLSI controllers is obtaining an assignment that produces minimal or near minimal logic. The key item in Programmable Logic Array (PLA) area minimization is the number of unique product terms required by the design equations. This paper presents a state assignment algorithm for minimizing the number of product terms required to implement a finite state machine using a PLA. Partition algebra with predecessor state information is used to derive a near optimal state assignment. A maximum bound on the number of product terms required can be obtained by inspecting the predecessor state information. The state assignment algorithm presented is much simpler than existing procedures and leads to the same number of product terms or less. An area-efficient PLA structure implemented in a 1.0 micron CMOS process is presented along with a summary of the performance for a controller implemented using this design procedure.
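The partition-algebra assignment procedure itself is not reproduced here, but the following sketch illustrates the quantity it seeks to minimize: for a toy four-state ring counter it counts the product terms of the minimized next-state equations under two candidate state assignments, using sympy's SOPform. The toy machine, the two encodings, and the per-output sum (which ignores product-term sharing across PLA outputs) are assumptions for illustration only.

from sympy import symbols
from sympy.logic import SOPform
from sympy.logic.boolalg import Or

y1, y0 = symbols("y1 y0")

# Toy FSM: a 4-state ring counter A -> B -> C -> D -> A.
transitions = {"A": "B", "B": "C", "C": "D", "D": "A"}

def product_terms(assignment):
    # Total minimized product terms for the next-state logic under one encoding.
    total = 0
    for bit in range(2):                       # two next-state bits (y1', y0')
        minterms = []
        for state, nxt in transitions.items():
            if assignment[nxt][bit]:
                minterms.append(list(assignment[state]))   # present-state code
        expr = SOPform([y1, y0], minterms)
        total += len(expr.args) if isinstance(expr, Or) else 1
    return total

enc_a = {"A": (0, 0), "B": (0, 1), "C": (1, 0), "D": (1, 1)}   # binary-order encoding
enc_b = {"A": (0, 0), "B": (0, 1), "C": (1, 1), "D": (1, 0)}   # Gray-code encoding
print(product_terms(enc_a), product_terms(enc_b))              # expect fewer terms for enc_b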
Universality in stochastic exponential growth.
Iyer-Biswas, Srividya; Crooks, Gavin E; Scherer, Norbert F; Dinner, Aaron R
2014-07-11
Recent imaging data for single bacterial cells reveal that their mean sizes grow exponentially in time and that their size distributions collapse to a single curve when rescaled by their means. An analogous result holds for the division-time distributions. A model is needed to delineate the minimal requirements for these scaling behaviors. We formulate a microscopic theory of stochastic exponential growth as a Master Equation that accounts for these observations, in contrast to existing quantitative models of stochastic exponential growth (e.g., the Black-Scholes equation or geometric Brownian motion). Our model, the stochastic Hinshelwood cycle (SHC), is an autocatalytic reaction cycle in which each molecular species catalyzes the production of the next. By finding exact analytical solutions to the SHC and the corresponding first passage time problem, we uncover universal signatures of fluctuations in exponential growth and division. The model makes minimal assumptions, and we describe how more complex reaction networks can reduce to such a cycle. We thus expect similar scalings to be discovered in stochastic processes resulting in exponential growth that appear in diverse contexts such as cosmology, finance, technology, and population growth.
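A minimal Gillespie-type simulation of a two-species autocatalytic cycle, in the spirit of the SHC, is sketched below; the rate constants, initial copy numbers, and the rough comparison of the ensemble mean against exponential growth at the geometric-mean rate are illustrative assumptions rather than the paper's calculations.

import numpy as np

rng = np.random.default_rng(0)
k1, k2 = 1.0, 2.0              # X1 catalyzes production of X2, X2 catalyzes production of X1

def trajectory(t_end=4.0, x1=5, x2=5):
    t = 0.0
    while t < t_end:
        a1, a2 = k1 * x1, k2 * x2          # reaction propensities
        a0 = a1 + a2
        t += rng.exponential(1.0 / a0)      # waiting time to the next reaction
        if rng.random() < a1 / a0:
            x2 += 1                         # X1 -> X1 + X2
        else:
            x1 += 1                         # X2 -> X2 + X1
    return x1 + x2

totals = np.array([trajectory() for _ in range(200)])
# Rough order-of-magnitude check against exponential growth at rate sqrt(k1*k2).
print("mean total size:", totals.mean(),
      "~ expected", 10 * np.exp(np.sqrt(k1 * k2) * 4.0))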
Sloped Terrain Segmentation for Autonomous Drive Using Sparse 3D Point Cloud
Cho, Seoungjae; Kim, Jonghyun; Ikram, Warda; Cho, Kyungeun; Sim, Sungdae
2014-01-01
A ubiquitous environment for road travel that uses wireless networks requires the minimization of data exchange between vehicles. An algorithm that can segment the ground in real time is necessary to obtain location data between vehicles simultaneously executing autonomous drive. This paper proposes a framework for segmenting the ground in real time using a sparse three-dimensional (3D) point cloud acquired from undulating terrain. A sparse 3D point cloud can be acquired by scanning the geography using light detection and ranging (LiDAR) sensors. For efficient ground segmentation, 3D point clouds are quantized in units of volume pixels (voxels) and overlapping data is eliminated. We reduce nonoverlapping voxels to two dimensions by implementing a lowermost heightmap. The ground area is determined on the basis of the number of voxels in each voxel group. We execute ground segmentation in real time by proposing an approach to minimize the comparison between neighboring voxels. Furthermore, we experimentally verify that ground segmentation can be executed at about 19.31 ms per frame. PMID:25093204
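The following numpy sketch illustrates the voxel/lowermost-heightmap idea on synthetic data; the voxel size, the height band used to label ground points, and the simple column hashing are assumptions for illustration, not the paper's parameters or implementation.

import numpy as np

def segment_ground(points, voxel=0.2, ground_band=0.3):
    # points: (N, 3) array of LiDAR returns; returns a boolean ground mask.
    idx = np.floor(points / voxel).astype(int)              # voxel indices
    keys = idx[:, 0] * 100000 + idx[:, 1]                    # flatten each (x, y) column
    order = np.argsort(keys)
    keys_sorted, z_sorted = keys[order], points[order, 2]
    # lowermost height per column (the "lowermost heightmap")
    first = np.r_[True, keys_sorted[1:] != keys_sorted[:-1]]
    col_ids = np.cumsum(first) - 1
    lowest = np.full(col_ids.max() + 1, np.inf)
    np.minimum.at(lowest, col_ids, z_sorted)
    ground_sorted = z_sorted <= lowest[col_ids] + ground_band
    mask = np.empty(len(points), dtype=bool)
    mask[order] = ground_sorted
    return mask

pts = np.random.rand(1000, 3) * [20, 20, 0.2]                # mostly flat toy cloud
pts[:50, 2] += 1.5                                           # a few elevated "obstacle" points
print("ground points:", segment_ground(pts).sum())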
Synchronizing MIDI and wireless EEG measurements during natural piano performance.
Zamm, Anna; Palmer, Caroline; Bauer, Anna-Katharina R; Bleichner, Martin G; Demos, Alexander P; Debener, Stefan
2017-07-08
Although music performance has been widely studied in the behavioural sciences, less work has addressed the underlying neural mechanisms, perhaps due to technical difficulties in acquiring high-quality neural data during tasks requiring natural motion. The advent of wireless electroencephalography (EEG) presents a solution to this problem by allowing for neural measurement with minimal motion artefacts. In the current study, we provide the first validation of a mobile wireless EEG system for capturing the neural dynamics associated with piano performance. First, we propose a novel method for synchronously recording music performance and wireless mobile EEG. Second, we provide results of several timing tests that characterize the timing accuracy of our system. Finally, we report EEG time domain and frequency domain results from N=40 pianists demonstrating that wireless EEG data capture the unique temporal signatures of musicians' performances with fine-grained precision and accuracy. Taken together, we demonstrate that mobile wireless EEG can be used to measure the neural dynamics of piano performance with minimal motion constraints. This opens many new possibilities for investigating the brain mechanisms underlying music performance. Copyright © 2017 Elsevier B.V. All rights reserved.
Sampling strategies for estimating acute and chronic exposures of pesticides in streams
Crawford, Charles G.
2004-01-01
The Food Quality Protection Act of 1996 requires that human exposure to pesticides through drinking water be considered when establishing pesticide tolerances in food. Several systematic and seasonally weighted systematic sampling strategies for estimating pesticide concentrations in surface water were evaluated through Monte Carlo simulation, using intensive datasets from four sites in northwestern Ohio. The number of samples for the strategies ranged from 4 to 120 per year. Sampling strategies with a minimal sampling frequency outside the growing season can be used for estimating time-weighted mean and percentile concentrations of pesticides with little loss of accuracy and precision, compared to strategies with the same sampling frequency year round. Less frequent sampling strategies can be used at large sites. A sampling frequency of 10 times monthly during the pesticide runoff period at a 90 km2 basin and four times monthly at a 16,400 km2 basin provided estimates of the time-weighted mean, 90th, 95th, and 99th percentile concentrations that fell within 50 percent of the true value virtually all of the time. By taking into account basin size and the periodic nature of pesticide runoff, costs of obtaining estimates of time-weighted mean and percentile pesticide concentrations can be minimized.
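A hedged sketch of the Monte Carlo idea: draw repeated systematic samples from a synthetic daily concentration record and check how often the sampled mean and a high percentile fall within 50 percent of the true values. The synthetic runoff-pulse record and the sample sizes are invented for illustration and are not the Ohio datasets.

import numpy as np

rng = np.random.default_rng(1)
days = np.arange(365)
# Synthetic daily pesticide record: low baseline plus a noisy runoff-season pulse.
conc = 0.05 + 2.0 * np.exp(-((days - 150) / 20.0) ** 2) * rng.lognormal(0, 0.5, 365)

def systematic_sample(n):
    step = len(conc) // n
    start = rng.integers(step)          # random start, then every 'step' days
    return conc[start::step][:n]

true_mean, true_p95 = conc.mean(), np.percentile(conc, 95)
for n in (12, 24, 52, 120):
    means, p95s = [], []
    for _ in range(1000):
        s = systematic_sample(n)
        means.append(s.mean())
        p95s.append(np.percentile(s, 95))
    print(n, "samples/yr: mean within 50% of true",
          np.mean(np.abs(np.array(means) / true_mean - 1) < 0.5).round(2),
          "| p95 within 50%",
          np.mean(np.abs(np.array(p95s) / true_p95 - 1) < 0.5).round(2))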
Kim, Ji Wan; Kim, Hyun Uk; Oh, Chang-Wug; Kim, Joon-Woo; Park, Ki Chul
2018-01-01
To compare the radiologic and clinical results of minimally invasive plate osteosynthesis (MIPO) and minimal open reduction and internal fixation (ORIF) for simple distal tibial fractures. Randomized prospective study. Three level 1 trauma centers. Fifty-eight patients with simple and distal tibial fractures were randomized into a MIPO group (treatment with MIPO; n = 29) or a minimal group (treatment with minimal ORIF; n = 29). These numbers were designed to define the rate of soft tissue complication; therefore, validation of superiority in union time or determination of differences in rates of delayed union was limited in this study. Simple distal tibial fractures treated with MIPO or minimal ORIF. The clinical outcome measurements included operative time, radiation exposure time, and soft tissue complications. To evaluate a patient's function, the American Orthopedic Foot and Ankle Society ankle score (AOFAS) was used. Radiologic measurements included fracture alignment, delayed union, and union time. All patients acquired bone union without any secondary intervention. The mean union time was 17.4 weeks and 16.3 weeks in the MIPO and minimal groups, respectively. There was 1 case of delayed union and 1 case of superficial infection in each group. The radiation exposure time was shorter in the minimal group than in the MIPO group. Coronal angulation showed a difference between both groups. The American Orthopedic Foot and Ankle Society ankle scores were 86.0 and 86.7 in the MIPO and minimal groups, respectively. Minimal ORIF resulted in similar outcomes, with no increased rate of soft tissue problems compared to MIPO. Both MIPO and minimal ORIF have high union rates and good functional outcomes for simple distal tibial fractures. Minimal ORIF did not result in increased rates of infection and wound dehiscence. Therapeutic Level II. See Instructions for Authors for a complete description of levels of evidence.
76 FR 30550 - Federal Management Regulation; Change in Consumer Price Index Minimal Value
Federal Register 2010, 2011, 2012, 2013, 2014
2011-05-26
... Minimal Value AGENCY: Office of Governmentwide Policy, GSA. ACTION: Final rule. SUMMARY: Pursuant to 5 U.S.C. 7342, at three-year intervals following January 1, 1981, the minimal value for foreign gifts must... required consultation has been completed and the minimal value has been increased to $350 or less as of...
Vesicant extravasation part II: Evidence-based management and continuing controversies.
Wickham, Rita; Engelking, Constance; Sauerland, Carmel; Corbi, Dominick
2006-11-27
To review the literature, synthesize current recommendations, and discuss remaining controversies regarding vesicant extravasation management. Published evidence-based reports, clinical articles, and anecdotal case reports about antineoplastic and nonantineoplastic vesicant agent management. Prevention of vesicant extravasation sequelae requires knowledge about vesicant extravasation manifestations and differentiation of vesicant extravasation from other local IV site reactions. When evidence is weak or missing, logical application of data-based or empirical management strategies is critical. Actions may include timely administration of subcutaneous or topical antidotes, comfort measures, and surgical interventions to minimize the extent of tissue damage and morbidity should extravasation occur. Vesicant extravasation and sequelae constitute a complex patient problem. Clinicians should strive to prevent extravasation or seek to minimize injury should it occur. To this end, clinicians must demonstrate awareness of its risks and use specialized knowledge when administering vesicant agents. Nurses who administer vesicant agents should understand the nursing and collaborative actions that should be taken to minimize patient morbidity, pain, and disability.
Minimizing target interference in PK immunoassays: new approaches for low-pH-sample treatment.
Partridge, Michael A; Pham, John; Dziadiv, Olena; Luong, Onson; Rafique, Ashique; Sumner, Giane; Torri, Albert
2013-08-01
Quantitating total levels of monoclonal antibody (mAb) biotherapeutics in serum using ELISA may be hindered by soluble targets. We developed two low-pH-sample-pretreatment techniques to minimize target interference. The first procedure involves sample pretreatment at pH <3.0 before neutralization and analysis in a target capture ELISA. Careful monitoring of acidification time is required to minimize potential impact on mAb detection. The second approach involves sample dilution into mild acid (pH ∼4.5) before transferring to an anti-human capture-antibody-coated plate without neutralization. Analysis of target-drug and drug-capture antibody interactions at pH 4.5 indicated that the capture antibody binds to the drug, while the drug and the target were dissociated. Using these procedures, total biotherapeutic levels were accurately measured when soluble target was >30-fold molar excess. These techniques provide alternatives for quantitating mAb biotherapeutics in the presence of a target when standard acid-dissociation procedures are ineffective.
Basic autonomy as a fundamental step in the synthesis of life.
Ruiz-Mirazo, Kepa; Moreno, Alvaro
2004-01-01
In the search for the primary roots of autonomy (a pivotal concept in Varela's comprehensive understanding of living beings), the theory of autopoiesis provided an explicit criterion to define minimal life in universal terms, and was taken as a guideline in the research program for the artificial synthesis of biological systems. Acknowledging the invaluable contribution of the autopoietic school to present biological thinking, we offer an alternative way of conceiving the most basic forms of autonomy. We give a bottom-up account of the origins of "self-production" (or self-construction, as we propose to call it), pointing out which are the minimal material and energetic requirements for the constitution of basic autonomous systems. This account is, indeed, committed to the project of developing a general theory of biology, but well grounded in the universal laws of physics and chemistry. We consider that the autopoietic theory was formulated in highly abstract terms and, in order to advance in the implementation of minimal autonomous systems (and, at the same time, make major progress in exploring the origins of life), a more specific characterization of minimal autonomous systems is required. Such a characterization will not be drawn from a review of the autopoietic criteria and terminology (à la Fleischaker) but demands a whole reformulation of the question: a proper naturalization of the concept of autonomy. Finally, we also discuss why basic autonomy, according to our account, is necessary but not sufficient for life, in contrast with Varela's idea that autopoiesis was a necessary and sufficient condition for it.
Attitude-Independent Magnetometer Calibration for Spin-Stabilized Spacecraft
NASA Technical Reports Server (NTRS)
Natanson, Gregory
2005-01-01
The paper describes a three-step estimator to calibrate a Three-Axis Magnetometer (TAM) using TAM and slit Sun or star sensor measurements. In the first step, the Calibration Utility forms a loss function from the residuals of the magnitude of the geomagnetic field. This loss function is minimized with respect to biases, scale factors, and nonorthogonality corrections. The second step minimizes residuals of the projection of the geomagnetic field onto the spin axis under the assumption that spacecraft nutation has been suppressed by a nutation damper. Minimization is done with respect to various directions of the body spin axis in the TAM frame. The direction of the spin axis in the inertial coordinate system required for the residual computation is assumed to be unchanged with time. It is either determined independently using other sensors or included in the estimation parameters. In both cases all estimation parameters can be found using simple analytical formulas derived in the paper. The last step is to minimize a third loss function formed by residuals of the dot product between the geomagnetic field and Sun or star vector with respect to the misalignment angle about the body spin axis. The method is illustrated by calibrating TAM for the Fast Auroral Snapshot Explorer (FAST) using in-flight TAM and Sun sensor data. The estimated parameters include magnetic biases, scale factors, and misalignment angles of the spin axis in the TAM frame. Estimation of the misalignment angle about the spin axis was inconclusive since (at least for the selected time interval) the Sun vector was about 15 degrees from the direction of the spin axis; as a result residuals of the dot product between the geomagnetic field and Sun vectors were to a large extent minimized as a by-product of the second step.
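The first step can be illustrated with a small least-squares fit on synthetic data; the sketch below estimates per-axis biases and scale factors from residuals of the field magnitude only (nonorthogonality corrections omitted) and is not the FAST Calibration Utility; all names and numbers are assumptions.

import numpy as np
from scipy.optimize import least_squares

rng = np.random.default_rng(2)
n = 500
t = np.linspace(0, 20, n)
# Synthetic "true" geomagnetic field along a spinning trajectory (nT).
B_true = 30000.0 * np.column_stack([np.cos(t), np.sin(t), 0.3 * np.sin(3 * t)])
B_mag = np.linalg.norm(B_true, axis=1)                    # reference field magnitude
bias_true = np.array([120.0, -80.0, 40.0])
scale_true = np.array([1.02, 0.98, 1.01])
B_meas = B_true * scale_true + bias_true + rng.normal(0, 5.0, (n, 3))

def residuals(p):
    bias, scale = p[:3], p[3:]
    corrected = (B_meas - bias) / scale
    return np.linalg.norm(corrected, axis=1) - B_mag      # magnitude residuals

fit = least_squares(residuals, x0=np.r_[np.zeros(3), np.ones(3)])
print("estimated bias:", fit.x[:3].round(1), "scale:", fit.x[3:].round(4))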
Performance characteristics of a slagging gasifier for MHD combustor systems
NASA Technical Reports Server (NTRS)
Smith, K. O.
1979-01-01
The performance of a two stage, coal combustor concept for magnetohydrodynamic (MHD) systems was investigated analytically. The two stage MHD combustor is comprised of an entrained flow, slagging gasifier as the first stage, and a gas phase reactor as the second stage. The first stage was modeled by assuming instantaneous coal devolatilization, and volatiles combustion and char gasification by CO2 and H2O in plug flow. The second stage combustor was modeled assuming adiabatic instantaneous gas phase reactions. Of primary interest was the dependence of char gasification efficiency on first stage particle residence time. The influence of first stage stoichiometry, heat loss, coal moisture, coal size distribution, and degree of coal devolatilization on gasifier performance and second stage exhaust temperature was determined. Performance predictions indicate that particle residence times on the order of 500 msec would be required to achieve gasification efficiencies in the range of 90 to 95 percent. The use of a finer coal size distribution significantly reduces the required gasifier residence time for acceptable levels of fuel use efficiency. Residence time requirements are also decreased by increased levels of coal devolatilization. Combustor design efforts should maximize devolatilization by minimizing mixing times associated with coal injection.
Real-time geometry-aware augmented reality in minimally invasive surgery.
Chen, Long; Tang, Wen; John, Nigel W
2017-10-01
The potential of augmented reality (AR) technology to assist minimally invasive surgery (MIS) lies in its computational performance and accuracy in dealing with challenging MIS scenes. Even with the latest hardware and software technologies, achieving both real-time and accurate augmented information overlay in MIS is still a formidable task. In this Letter, the authors present a novel real-time AR framework for MIS that achieves interactive geometric aware AR in endoscopic surgery with stereo views. The authors' framework tracks the movement of the endoscopic camera and simultaneously reconstructs a dense geometric mesh of the MIS scene. The movement of the camera is predicted by minimising the re-projection error to achieve a fast tracking performance, while the three-dimensional mesh is incrementally built by a dense zero mean normalised cross-correlation stereo-matching method to improve the accuracy of the surface reconstruction. The proposed system does not require any prior template or pre-operative scan and can infer the geometric information intra-operatively in real time. With the geometric information available, the proposed AR framework is able to interactively add annotations, localisation of tumours and vessels, and measurement labelling with greater precision and accuracy compared with the state-of-the-art approaches.
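The zero-mean normalised cross-correlation (ZNCC) score at the heart of the stereo-matching step can be sketched as follows; the brute-force disparity search and the synthetic image pair are simplifications for illustration, not the authors' incremental reconstruction pipeline.

import numpy as np

def zncc(a, b, eps=1e-8):
    a = a - a.mean()
    b = b - b.mean()
    return float((a * b).sum() / (np.sqrt((a * a).sum() * (b * b).sum()) + eps))

def best_disparity(left, right, row, col, half=3, max_disp=32):
    # Find the disparity maximising ZNCC between left/right patches on one scanline.
    patch_l = left[row - half:row + half + 1, col - half:col + half + 1]
    scores = []
    for d in range(max_disp):
        c = col - d
        if c - half < 0:
            break
        patch_r = right[row - half:row + half + 1, c - half:c + half + 1]
        scores.append(zncc(patch_l, patch_r))
    return int(np.argmax(scores)), max(scores)

rng = np.random.default_rng(3)
right = rng.random((100, 200))
left = np.roll(right, 7, axis=1)                # synthetic 7-pixel horizontal shift
print(best_disparity(left, right, row=50, col=100))   # expect disparity 7, score ~1.0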
Steady state method to determine unsaturated hydraulic conductivity at the ambient water potential
HUbbell, Joel M.
2014-08-19
The present invention relates to a new laboratory apparatus for measuring the unsaturated hydraulic conductivity at a single water potential. One or more embodiments of the invented apparatus can be used over a wide range of water potential values within the tensiometric range, require minimal laboratory preparation, and operate unattended for extended periods with minimal supervision.
Todor, Adrian; Pojar, Adina; Lucaciu, Dan
2013-01-01
The aim of the study was to evaluate the results of minimally invasive treatment of trochanteric fractures with the use of intramedullary nails. From September 2010 to September 2012 we treated 21 patients with pertrochanteric fractures by a minimally invasive technique using the Gamma 3 (Stryker, Howmedica) nail. There were 13 women and 8 men with a mean age of 74.1 years, ranging from 58 to 88 years. Fractures were classified as being stable (AO type 31-A1) in 5 cases and unstable (AO type 31-A2 and A3) in the remaining 16 cases. Patients were reviewed at 6 weeks and 3 months postoperatively. Mean surgery time was 46.8 minutes and mean hospital stay was 14.9 days. No patients required blood transfusions. During the hospital stay all the patients were mobilized with weight bearing as tolerated. All patients were available for review at 6 weeks, and 2 were lost to the 3-month follow-up. Sixteen patients regained their previous level of activity. This minimally invasive technique using a gamma nail device for pertrochanteric fractures gives reliably good results with excellent preservation of hip function.
Daolagupu, Arup K; Mudgal, Ashwani; Agarwala, Vikash; Dutta, Kaushik K
2017-01-01
Extraarticular distal tibial fractures are among the most challenging fractures to treat because of the subcutaneous location, poor blood supply, and limited anterior muscular cover of the distal tibia; complications such as delayed union, nonunion, wound infection, and wound dehiscence pose a great challenge to the surgeon. Minimally invasive plate osteosynthesis (MIPO) and intramedullary interlocking nail (IMLN) are two well-accepted and effective methods, but each has been historically related to complications. This study compares clinical and radiological outcomes in extraarticular distal tibia fractures treated by intramedullary interlocking nail (IMLN) and minimally invasive plate osteosynthesis (MIPO). Forty-two patients who met the inclusion criteria and were operated on between June 2014 and May 2015 were included in this study; 21 underwent IMLN and 21 were treated with MIPO. Patients were followed up for clinical and radiological evaluation. In the IMLN group, the average union time was 18.26 weeks compared to 21.70 weeks in the plating group, a significant difference (P < 0.0001). The average time required for partial and full weight bearing in the nailing group was 4.95 weeks and 10.09 weeks, respectively, significantly less (P < 0.0001) than the 6.90 weeks and 13.38 weeks in the plating group. Fewer complications in terms of implant irritation, ankle stiffness, and infection were seen in the interlocking group than in the plating group. The average functional outcome according to the American Orthopedic Foot and Ankle Society score was 96.67. The IMLN group was associated with a shorter duration of surgery, earlier weight bearing and union, and a lower incidence of infection and implant irritation, which makes it a preferable choice for fixation of extra-articular distal tibial fractures. However, larger randomized controlled trials are required to confirm the results.
Pasalic, Dario; Funk, Ryan K; García, Joaquín J; Price, Daniel L; Price, Katharine A; Harmsen, William S; Patel, Samir H; Young, Geoffrey D; Foote, Robert L; Moore, Eric J; Ma, Daniel J
2018-07-01
To determine the outcomes and toxicities of minimally invasive surgery with adjuvant intensity-modulated radiotherapy +/- chemotherapy (AT) compared to definitive surgical therapy (ST) in a contemporary cohort of HPV-positive oropharyngeal squamous cell carcinoma (OPSCC). From 2005 to 2013, a consecutive cohort of 190 HPV-positive OPSCC patients was retrospectively reviewed from multi-institutional databases maintained by the Departments of Otorhinolaryngology and Radiation Oncology. A total of 116 AT patients and 42 ST patients with intermediate or high risk pathologic features were included in the final analysis. All patients received minimally invasive surgery. Time to recurrence and time to death from the onset of surgery were evaluated. Toxicity data collected included dysphagia or xerostomia requiring feeding tube placement >6 months, or mandibular osteonecrosis requiring surgery or hyperbaric oxygen. All AT patients received IMRT to a median dose of 60 Gy. Chemotherapy was delivered to 67.2% of AT patients. The AT group included more high-risk patients, given higher nodal classification (p = 0.005) and extracapsular extension (p = 0.0005). AT improved disease-free survival (HR 2.77, CI 1.22-6.28; p = 0.02) and local-regional control (HR 14.83, CI 3.240-67.839; p = 0.001). Disease-free survival with AT and tumor extracapsular extension was improved when compared to ST (HR of 4.34, CI 1.540-12.213; p = 0.006). Dysphagia or mandibular osteonecrosis toxicity occurred in 19.0% of patients after AT vs. 2.4% after ST. AT improved local-regional control and disease-free survival but was associated with greater toxicity. The recurrence benefit was most pronounced in tumors with extracapsular extension. Copyright © 2018 Elsevier Ltd. All rights reserved.
Interoperability Across the Stewardship Spectrum in the DataONE Repository Federation
NASA Astrophysics Data System (ADS)
Jones, M. B.; Vieglais, D.; Wilson, B. E.
2016-12-01
Thousands of earth and environmental science repositories serve many researchers and communities, each with their own community and legal mandates, sustainability models, and historical infrastructure. These repositories span the stewardship spectrum from highly curated collections that employ large numbers of staff members to review and improve data, to small, minimal budget repositories that accept data caveat emptor and where all responsibility for quality lies with the submitter. Each repository fills a niche, providing services that meet the stewardship tradeoffs of one or more communities. We have reviewed these stewardship tradeoffs for several DataONE member repositories ranging from minimally (KNB) to highly curated (Arctic Data Center), as well as general purpose (Dryad) to highly discipline or project specific (NEON). The rationale behind different levels of stewardship reflects the resolution of these tradeoffs. Some repositories aim to encourage extensive uptake by keeping processes simple and minimizing the amount of information collected, but this limits the long-term utility of the data and the search, discovery, and integration systems that are possible. Other repositories require extensive metadata input, review, and assessment, allowing for excellent preservation, discovery, and integration but at the cost of significant time for submitters and expense for curatorial staff. DataONE recognizes these different levels of curation, and attempts to embrace them to create a federation that is useful across the stewardship spectrum. DataONE provides a tiered model for repositories with growing utility of DataONE services at higher tiers of curation. The lowest tier supports read-only access to data and requires little more than title and contact metadata. Repositories can gradually phase in support for higher levels of metadata and services as needed. These tiered capabilities are possible through flexible support for multiple metadata standards and services, where repositories can incrementally increase their requirements as they want to satisfy more use cases. Within DataONE, metadata search services support minimal metadata models, but significantly expanded precision and recall become possible when repositories provide more extensively curated metadata.
Endometrial ablation: normal appearance and complications.
Drylewicz, Monica R; Robinson, Kathryn; Siegel, Cary Lynn
2018-03-14
Global endometrial ablation is a commonly performed, minimally invasive technique aimed at improving/resolving abnormal uterine bleeding and menorrhagia in women. As non-resectoscopic techniques have come into existence, endometrial ablation performance continues to increase due to accessibility and decreased requirements for operating room time and advanced technical training. The increased utilization of this method translates into increased imaging of patients who have undergone the procedure. An understanding of the expected imaging appearances of endometrial ablation using different modalities is important for the abdominal radiologist. In addition, the frequent usage of the technique naturally comes with complications requiring appropriate imaging work-up. We review the expected appearance of the post-endometrial ablated uterus on multiple imaging modalities and demonstrate the more common and rare complications seen in the immediate post-procedural time period and remotely.
NASA Astrophysics Data System (ADS)
Devendran, Citsabehsan; Collins, David J.; Ai, Ye; Neild, Adrian
2017-04-01
Periodic pattern generation using time-averaged acoustic forces conventionally requires the intersection of counterpropagating wave fields, where suspended micro-objects in a microfluidic system collect along force potential minimizing nodal or antinodal lines. Whereas this effect typically requires either multiple transducer elements or whole channel resonance, we report the generation of scalable periodic patterning positions without either of these conditions. A single propagating surface acoustic wave interacts with the proximal channel wall to produce a knife-edge effect according to the Huygens-Fresnel principle, where these cylindrically propagating waves interfere with classical wave fronts emanating from the substrate. We simulate these conditions and describe a model that accurately predicts the lateral spacing of these positions in a robust and novel approach to acoustic patterning.
Mining Deployment Optimization
NASA Astrophysics Data System (ADS)
Čech, Jozef
2016-09-01
The deployment problem, researched primarily in the military sector, is emerging in some other industries, mining included. The principal decision is how to deploy some activities in space and time to achieve a desired outcome while complying with certain requirements or limits. Requirements and limits are on the side of constraints, while minimizing costs or maximizing some benefits are on the side of objectives. A model with application to mining of a polymetallic deposit is presented. The main intention of the computer-based tool is to give the mining engineer quick, immediate decision solutions together with possibilities for experimentation. The task is to determine the strategic deployment of mining activities on a deposit, meeting the planned output from the mine while at the same time complying with limited reserves and haulage capacities. Priorities and benefits can be formulated by the planner.
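One way to make the deployment formulation concrete is as a small linear program; the toy model below (invented data, not the paper's model) allocates extraction among three blocks over two periods to meet the planned output at minimum cost while respecting block reserves and haulage capacity.

import numpy as np
from scipy.optimize import linprog

blocks, periods = 3, 2
cost = np.array([[8.0, 9.0], [7.0, 7.5], [10.0, 9.5]])   # cost per tonne, block x period
reserves = np.array([50.0, 80.0, 60.0])                   # tonnes available per block
haul_cap = np.array([70.0, 70.0])                         # tonnes movable per period
demand = np.array([60.0, 65.0])                           # planned output per period

c = cost.ravel()                                          # decision variables: tonnes x[b, t]
A_ub, b_ub = [], []
for b in range(blocks):                                   # sum over periods <= block reserve
    row = np.zeros(blocks * periods)
    row[b * periods:(b + 1) * periods] = 1.0
    A_ub.append(row); b_ub.append(reserves[b])
for t in range(periods):                                  # sum over blocks <= haulage capacity
    row = np.zeros(blocks * periods)
    row[t::periods] = 1.0
    A_ub.append(row); b_ub.append(haul_cap[t])
A_eq = []
for t in range(periods):                                  # meet the planned output exactly
    row = np.zeros(blocks * periods)
    row[t::periods] = 1.0
    A_eq.append(row)

res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=demand,
              bounds=[(0, None)] * (blocks * periods), method="highs")
print(res.x.reshape(blocks, periods), res.fun)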
Patient safety and minimizing risk with insulin administration - role of insulin degludec.
Aye, Myint M; Atkin, Stephen L
2014-01-01
Diabetes is a lifelong condition requiring ongoing medical care and patient self-management. Exogenous insulin therapy is essential in type 1 diabetes and becomes a necessity in patients with longstanding type 2 diabetes who fail to achieve optimal control with lifestyle modification, oral agents, and glucagon-like peptide 1-based therapy. One of the risks that hinders insulin use is hypoglycemia. Optimal insulin therapy should therefore minimize the risk of hypoglycemia while improving glycemic control. Insulin degludec (IDeg) is a novel basal insulin that, following subcutaneous injection, assembles into a depot of soluble multihexamer chains. These subsequently release IDeg monomers that are absorbed at a slow and steady rate into the circulation, with the terminal half-life of IDeg being ~25 hours. Thus, it requires only once-daily dosing unlike other basal insulin preparations that often require twice-daily dosing. Despite its long half-life, once-daily IDeg does not cause accumulation of insulin in the circulation after reaching steady state. IDeg once a day will produce a steady-state profile with a lower peak:trough ratio than other basal insulins. In clinical trials, this profile translates into a lower frequency of nocturnal hypoglycemia compared with insulin glargine, as well as an ability to allow some flexibility in dose timing without compromising efficacy and safety. Indeed, a study that tested the extremes of dosing intervals of 8 and 40 hours showed no detriment in either glycemic control or hypoglycemic frequency versus insulin glargine given at the same time each day. While extreme flexibility in dose timing is not recommended, these findings are reassuring. This may be particularly beneficial to elderly patients, patients with learning difficulties, or others who have to rely on health-care professionals for their daily insulin injections. Further studies are required to confirm whether this might benefit adherence to treatment, reduce long-term hypoglycemia or reduce diabetes-related complications.
NASA Astrophysics Data System (ADS)
Martins, C. G.; Behrens, J. H.; Destro, M. T.; Franco, B. D. G. M.; Vizeu, D. M.; Hutzler, B.; Landgraf, M.
2004-09-01
Consumer attitudes towards foods have changed in the last two decades, increasing requirements for freshlike products. Consequently, less extreme treatments or additives are being required. Minimally processed foods have freshlike characteristics and satisfy this new consumer demand. Besides freshness, minimal processing also provides the convenience required by the market. Salad vegetables can be a source of pathogens such as Salmonella, Escherichia coli O157:H7, and Shigella spp. Minimal processing does not reduce the levels of pathogenic microorganisms to safe levels. Therefore, this study was carried out in order to improve the microbiological safety and the shelf-life of minimally processed vegetables using gamma radiation. Minimally processed watercress inoculated with a cocktail of Salmonella spp. was exposed to 0.0, 0.2, 0.5, 0.7, 1.0, 1.2 and 1.5 kGy. Irradiated samples were diluted 1:10 in saline peptone water and plated onto tryptic soy agar that was incubated at 37°C for 24 h. D10 values for Salmonella spp. inoculated in watercress varied from 0.29 to 0.43 kGy. Therefore, a dose of 1.7 kGy will reduce the Salmonella population in watercress by 4 log10. The shelf-life was increased by 1.5 days when the product was exposed to 1 kGy.
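The quoted dose follows directly from the measured D10 values; a one-line check, under the assumption that the most resistant case (D10 = 0.43 kGy) governs, is:

d10_worst = 0.43                 # kGy, most radiation-resistant case observed above
log_reduction = 4
print("required dose:", round(d10_worst * log_reduction, 2), "kGy")   # ~1.72 kGy, i.e. about 1.7 kGy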
Control at stability's edge minimizes energetic costs: expert stick balancing
Meyer, Ryan; Zhvanetsky, Max; Ridge, Sarah; Insperger, Tamás
2016-01-01
Stick balancing on the fingertip is a complex voluntary motor task that requires the stabilization of an unstable system. For seated expert stick balancers, the time delay is 0.23 s, the shortest stick that can be balanced for 240 s is 0.32 m and there is a ° dead zone for the estimation of the vertical displacement angle in the saggital plane. These observations motivate a switching-type, pendulum–cart model for balance control which uses an internal model to compensate for the time delay by predicting the sensory consequences of the stick's movements. Numerical simulations using the semi-discretization method suggest that the feedback gains are tuned near the edge of stability. For these choices of the feedback gains, the cost function which takes into account the position of the fingertip and the corrective forces is minimized. Thus, expert stick balancers optimize control with a combination of quick manoeuvrability and minimum energy expenditures. PMID:27278361
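A much-simplified numerical sketch of delayed proportional-derivative balancing of an inverted stick is given below, using the 0.23 s delay quoted above and a 0.6 m stick (comfortably above the 0.32 m limit); the feedback gains and the omission of the cart, dead zone, and sensory noise are assumptions, so this is not the paper's switching model.

import numpy as np

g, L, tau, dt = 9.81, 0.6, 0.23, 0.001
steps, delay = 20000, int(round(tau / dt))
theta = np.zeros(steps)
omega = np.zeros(steps)
theta[0] = 0.01                                  # small initial tilt (rad)
Kp, Kd = 10.8, 3.3                               # assumed gains inside the narrow stable region

for k in range(1, steps):
    j = max(k - 1 - delay, 0)                    # index of the delayed measurement
    a = Kp * theta[j] + Kd * omega[j]            # fingertip acceleration command
    alpha = (g * theta[k - 1] - a) / L           # linearised stick dynamics
    omega[k] = omega[k - 1] + alpha * dt
    theta[k] = theta[k - 1] + omega[k] * dt

# With gains inside the stable region the tilt stays bounded; detuning them destabilizes it.
print("max |theta| over the last 5 s:", float(np.abs(theta[-5000:]).max()))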
A Technical Survey on Optimization of Processing Geo Distributed Data
NASA Astrophysics Data System (ADS)
Naga Malleswari, T. Y. J.; Ushasukhanya, S.; Nithyakalyani, A.; Girija, S.
2018-04-01
With growing cloud services and technology, there is growth in some geographically distributed data centers to store large amounts of data. Analysis of geo-distributed data is required in various services for data processing, storage of essential information, etc., processing this geo-distributed data and performing analytics on this data is a challenging task. The distributed data processing is accompanied by issues in storage, computation and communication. The key issues to be dealt with are time efficiency, cost minimization, utility maximization. This paper describes various optimization methods like end-to-end multiphase, G-MR, etc., using the techniques like Map-Reduce, CDS (Community Detection based Scheduling), ROUT, Workload-Aware Scheduling, SAGE, AMP (Ant Colony Optimization) to handle these issues. In this paper various optimization methods and techniques used are analyzed. It has been observed that end-to end multiphase achieves time efficiency; Cost minimization concentrates to achieve Quality of Service, Computation and reduction of Communication cost. SAGE achieves performance improvisation in processing geo-distributed data sets.
Performance and Health Test Procedure for Grid Energy Storage Systems: Preprint
DOE Office of Scientific and Technical Information (OSTI.GOV)
Baggu, Murali M; Smith, Kandler A; Friedl, Andrew
A test procedure to evaluate the performance and health of field installations of grid-connected battery energy storage systems (BESS) is described. Performance and health metrics captured in the procedures are: round-trip efficiency, standby losses, response time/accuracy, and useable energy/state of charge at different discharge/charge rates over the system's lifetime. The procedures are divided into Reference Performance Tests, which require the system to be put in a test mode and are to be conducted at intervals, and Real-time Monitoring tests, which collect data during normal operation without interruption. The procedures can be applied to a wide array of BESS with little modification and can thus support BESS operators in the management of BESS field installations with minimal interruption and expenditure.
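Two of the listed metrics can be illustrated with a trivial calculation from metered power samples; the sample values below are invented and do not come from the procedure itself.

import numpy as np

dt_h = 1.0 / 60.0                                 # 1-minute samples, expressed in hours
p_charge = np.full(120, 50.0)                     # kW drawn from the grid for 2 h
p_discharge = np.full(110, -50.0)                 # kW delivered back for ~1.83 h
e_in = (p_charge * dt_h).sum()                    # kWh into the battery
e_out = -(p_discharge * dt_h).sum()               # kWh returned
print("round-trip efficiency: %.1f%%" % (100 * e_out / e_in))

# Standby loss expressed as an average power, from an idle state-of-charge drop.
soc_drop_pct, idle_hours, usable_kwh = 2.0, 24.0, 200.0
print("standby loss: %.2f kW" % (soc_drop_pct / 100 * usable_kwh / idle_hours))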
Manipulation and handling processes off-line programming and optimization with use of K-Roset
NASA Astrophysics Data System (ADS)
Gołda, G.; Kampa, A.
2017-08-01
Contemporary trends in the development of efficient, flexible manufacturing systems require practical implementation of modern “Lean production” concepts for maximizing customer value through minimizing all wastes in manufacturing and logistics processes. Every FMS is built on the basis of automated and robotized production cells. Apart from flexible CNC machine tools and other equipment, industrial robots are the primary elements of the system. In this study, the authors look for wastes of time and cost in the real tasks performed by robots during manipulation processes. For the optimization of handling and manipulation processes carried out by robots, the application of modern off-line programming methods and computer simulation is the best solution and the only way to minimize unnecessary movements and other instructions. The modelling process of a robotized production cell and off-line programming of Kawasaki robots in AS-Language are described. The simulation of the robotized workstation is realized with the K-Roset virtual reality software. The authors show the process of improving and optimizing the industrial robots' programs in terms of minimizing the number of useless manipulator movements and unnecessary instructions. This is done in order to shorten the time of production cycles, which also reduces the costs of handling, manipulation and the technological process.
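As a generic illustration of removing unnecessary movements off-line (not K-Roset or AS-Language), the sketch below reorders the pick positions of a handling task with a nearest-neighbour heuristic and compares the resulting path length with the naive order; the coordinates are invented.

import numpy as np

points = np.array([[0.0, 0.0], [0.5, 0.1], [0.1, 0.4],
                   [0.6, 0.5], [0.2, 0.9], [0.7, 0.8]])   # pick positions (m)

def path_length(order):
    p = points[order]
    return float(np.linalg.norm(np.diff(p, axis=0), axis=1).sum())

def nearest_neighbour(start=0):
    # Greedily visit the closest remaining pick position each time.
    remaining, order = set(range(1, len(points))), [start]
    while remaining:
        last = points[order[-1]]
        nxt = min(remaining, key=lambda i: np.linalg.norm(points[i] - last))
        order.append(nxt)
        remaining.remove(nxt)
    return order

naive = list(range(len(points)))
optimized = nearest_neighbour()
print("naive order length:  %.2f m" % path_length(naive))
print("reordered length:    %.2f m" % path_length(optimized))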
Liu, Hongtao; Johnson, Jeffrey L.; Koval, Greg; Malnassy, Greg; Sher, Dorie; Damon, Lloyd E.; Hsi, Eric D.; Bucci, Donna Marie; Linker, Charles A.; Cheson, Bruce D.; Stock, Wendy
2012-01-01
Background In the present study, the prognostic impact of minimal residual disease during treatment on time to progression and overall survival was analyzed prospectively in patients with mantle cell lymphoma treated on the Cancer and Leukemia Group B 59909 clinical trial. Design and Methods Peripheral blood and bone marrow samples were collected during different phases of the Cancer and Leukemia Group B 59909 study for minimal residual disease analysis. Minimal residual disease status was determined by quantitative polymerase chain reaction of IgH and/or BCL-1/JH gene rearrangement. Correlation of minimal residual disease status with time to progression and overall survival was determined. In multivariable analysis, minimal residual disease, and other risk factors were correlated with time to progression. Results Thirty-nine patients had evaluable, sequential peripheral blood and bone marrow samples for minimal residual disease analysis. Using peripheral blood monitoring, 18 of 39 (46%) achieved molecular remission following induction therapy. The molecular remission rate increased from 46 to 74% after one course of intensification therapy. Twelve of 21 minimal residual disease positive patients (57%) progressed within three years of follow up compared to 4 of 18 (22%) molecular remission patients (P=0.049). Detection of minimal residual disease following induction therapy predicted disease progression with a hazard ratio of 3.7 (P=0.016). The 3-year probability of time to progression among those who were in molecular remission after induction chemotherapy was 82% compared to 48% in patients with detectable minimal residual disease. The prediction of time to progression by post-induction minimal residual disease was independent of other prognostic factors in multivariable analysis. Conclusions Detection of minimal residual disease following induction immunochemotherapy was an independent predictor of time to progression following immunochemotherapy and autologous stem cell transplantation for mantle cell lymphoma. The clinical trial was registered at ClinicalTrials.gov: NCT00020943. PMID:22102709
Şahin, Nur; Genc, Mine; Turan, Gülüzar Arzu; Kasap, Esin; Güçlü, Serkan
2018-03-13
The modified Misgav-Ladach method (MML) is a minimally invasive cesarean section procedure compared with the classic Pfannenstiel-Kerr (PK) method. The aim of the study was to compare the MML method and the PK method in terms of intraoperative and short-term postoperative outcomes. This prospective, randomized controlled trial involved 252 pregnant women scheduled for primary emergency or elective cesarean section between October, 2014 and July, 2015. The primary outcome measures were the duration of surgery, extraction time, Apgar score, blood loss, wound complications, and number of sutures used. Secondary outcome measures were the wound infection, time of bowel restitution, visual analogue scale (VAS) scores at 6 h and 24 h after the operation, limitations in movement, and analgesic requirements. At 6 weeks after surgery, the patients were evaluated regarding late complications. There was a significant reduction in total operating and extraction time in the MML group (p < 0.001). Limitations in movement were lower at 24 h after the MML operation, and less analgesic was required in the MML group. There was no difference between the 2 groups in terms of febrile morbidity or the duration of hospitalization. At 6 weeks after the operation, no complaints and no additional complications from the surgery were noted. The MML method is a minimally invasive cesarean section. In the future, as surgeons' experience increases, MML will likely be chosen more often than the classic PK method.
Posterior retroperitoneoscopic adrenalectomy: outcomes and lessons learned from initial 50 cases.
Cabalag, Miguel S; Mann, G Bruce; Gorelik, Alexandra; Miller, Julie A
2015-06-01
Posterior retroperitoneoscopic adrenalectomy (PRA) is an alternative approach to minimally invasive adrenalectomy, potentially offering less pain and faster recovery compared with laparoscopic transperitoneal adrenalectomy (LA). The authors have recently changed from LA to PRA in suitable patients and audited their first 50 cases. Data were prospectively collected for 50 consecutive PRAs performed by the same surgeon. Patient demographics, tumour characteristics, analgesia use, operative and preparation time, length of stay, and complications were recorded. Fifty adrenalectomies were performed in 49 patients. The median (range) age was 58.5 years (30-83) and the majority of patients were female (n = 33, 66.0%). The median (interquartile range (IQR)) preparation time was 35.5 (28.5-50.0) min and the median operation time was 70.5 (54-85) min, which decreased during the study period. After a learning curve of 15 cases, median operative time reached 61 min. PRA patients required minimal post-operative analgesia, with a median (IQR) of 0 (0-5) mg of intravenous morphine equivalent used. The median (IQR) length of stay was 1 (1-1) day, with 8 (16.0%) same-day discharges. There were four complications: one episode of blood pressure lability from a phaeochromocytoma, one reintubation, one self-limited bleed and one temporary subcostal neuropraxia. There were no conversions to open surgery or deaths. Our results support previously published findings that PRA is a safe procedure, with a relatively short learning curve, resulting in minimal post-operative analgesia use and short length of hospital stay. © 2014 Royal Australasian College of Surgeons.
Probabilistic Risk Model for Organ Doses and Acute Health Effects of Astronauts on Lunar Missions
NASA Technical Reports Server (NTRS)
Kim, Myung-Hee Y.; Hu, Shaowen; Nounu, Hatem N.; Cucinotta, Francis A.
2009-01-01
Exposure to large solar particle events (SPEs) is a major concern during EVAs on the lunar surface and in Earth-to-Lunar transit. 15% of crew time may be spent on EVA with minimal radiation shielding. Therefore, an accurate assessment of SPE occurrence probability is required for mission planning by NASA. We apply probabilistic risk assessment (PRA) for radiation protection of crews and optimization of lunar mission planning.
Glass ionomer-silver cermet Class II tunnel-restorations for primary molars.
Croll, T P
1988-01-01
Tunnel preparations preserve the anatomical marginal ridge and minimize the loss of healthy tooth structure adjacent to the carious lesion. When the practitioner has developed proficiency in restoring class II carious lesions with tunnel restorations, less treatment time is required than with traditional class II preparations. The technique for restoring a primary first molar with a class II carious lesion, using a tunnel preparation and Ketac-Silver restorative material is described.
1988-03-01
Species profile fragment for chum salmon, Oncorhynchus keta (Walbaum 1792): NOMENCLATURE/TAXONOMY/RANGE and MORPHOLOGY/IDENTIFICATION AIDS (dorsal fin 10-13 rays; adipose fin present; Figure 1). Like pink salmon, Oncorhynchus gorbuscha, chum salmon spend minimal time rearing in freshwater; in this respect they are considerably unlike sockeye, Oncorhynchus nerka, and coho (J. Ames, 1984, WDF; pers. comm.). In Southern British Columbia, the average rearing time of salmon that spawn in large river systems is sometimes twice that long.
ERIC Educational Resources Information Center
Nolan, Carson Y., Ed.
The second of a three-volume final report presents results of three studies on indexing systems for tape recordings used by blind persons. Study I compared five tonal index codes in order to identify a code that required minimal display time, that had easily discriminable characters, and that could be easily learned. Results…
Performance of an 8 kW Hall Thruster
2000-01-12
For the purpose of either orbit raising and/or repositioning, the Hall thruster must be capable of delivering sufficient thrust to minimize transfer... time. This, coupled with the increasing on-board electric power capacity of military and commercial satellites, requires a high power Hall thruster that... development of a novel, high power Hall thruster capable of efficient operation over a broad range of Isp and thrust. We call such a thruster the bi
Holyfield, Christine; Drager, Kathryn; Light, Janice; Caron, Jessica Gosnell
2017-08-15
Augmentative and alternative communication (AAC) promotes communicative participation and language development for young children with complex communication needs. However, the motor, linguistic, and cognitive demands of many AAC technologies restrict young children's operational use of and influence over these technologies. The purpose of the current study is to better understand young children's participation in programming vocabulary "just in time" on an AAC application with minimized demands. A descriptive study was implemented to highlight the participation of 10 typically developing toddlers (M age: 16 months, range: 10-22 months) in just-in-time vocabulary programming in an AAC app with visual scene displays. All 10 toddlers participated in some capacity in adding new visual scene displays and vocabulary to the app just in time. Differences in participation across steps were observed, suggesting variation in the developmental demands of controls involved in vocabulary programming. Results from the current study provide clinical insights toward involving young children in AAC programming just in time and steps that may allow for more independent participation or require more scaffolding. Technology designed to minimize motor, cognitive, and linguistic demands may allow children to participate in programming devices at a younger age.
An Efficient Randomized Algorithm for Real-Time Process Scheduling in PicOS Operating System
NASA Astrophysics Data System (ADS)
Helmy*, Tarek; Fatai, Anifowose; Sallam, El-Sayed
PicOS is an event-driven operating environment designed for use with embedded networked sensors. More specifically, it is designed to support the concurrency in intensive operations required by networked sensors with minimal hardware requirements. Existing process scheduling algorithms of PicOS, a commercial tiny, low-footprint, real-time operating system, have their associated drawbacks. An efficient alternative algorithm, based on a randomized selection policy, has been proposed, demonstrated, confirmed for efficiency and fairness on average, and recommended for implementation in PicOS. Simulations were carried out, and performance measures such as Average Waiting Time (AWT) and Average Turn-around Time (ATT) were used to assess the efficiency of the proposed randomized version over the existing ones. The results show that the randomized algorithm is the most attractive for implementation in PicOS, since it is the fairest and has the lowest AWT and ATT on average among the non-preemptive scheduling algorithms implemented in this paper.
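As a rough illustration of the comparison described above, the sketch below simulates a non-preemptive randomized selection policy and reports AWT and ATT for a synthetic workload; the job list, seeding, and tie-breaking details are illustrative assumptions, not the PicOS implementation.

```python
import random

def randomized_schedule(jobs, seed=0):
    """Non-preemptive scheduling with a randomized selection policy.

    jobs: list of (arrival_time, burst_time) tuples.
    Returns (average_waiting_time, average_turnaround_time).
    """
    rng = random.Random(seed)
    pending = list(range(len(jobs)))
    clock = 0.0
    waiting_sum = turnaround_sum = 0.0

    while pending:
        # Jobs that have already arrived; if none, jump to the next arrival.
        ready = [i for i in pending if jobs[i][0] <= clock]
        if not ready:
            clock = min(jobs[i][0] for i in pending)
            continue
        i = rng.choice(ready)            # randomized selection policy
        arrival, burst = jobs[i]
        waiting = clock - arrival
        clock += burst                   # run the chosen job to completion
        waiting_sum += waiting
        turnaround_sum += waiting + burst
        pending.remove(i)

    n = len(jobs)
    return waiting_sum / n, turnaround_sum / n

# Example workload: (arrival, burst) pairs.
jobs = [(0, 5), (1, 3), (2, 8), (3, 2)]
print(randomized_schedule(jobs))
```

Averaging the two metrics over many seeds and comparing against FCFS or SJF orderings reproduces the kind of AWT/ATT comparison the abstract describes.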
Optimal processor assignment for pipeline computations
NASA Technical Reports Server (NTRS)
Nicol, David M.; Simha, Rahul; Choudhury, Alok N.; Narahari, Bhagirath
1991-01-01
The availability of large scale multitasked parallel architectures introduces the following processor assignment problem for pipelined computations. Given a set of tasks and their precedence constraints, along with their experimentally determined individual response times for different processor sizes, find an assignment of processors to tasks. Two objectives are of interest: minimal response time given a throughput requirement, and maximal throughput given a response time requirement. These assignment problems differ considerably from the classical mapping problem in which several tasks share a processor; instead, it is assumed that a large number of processors are to be assigned to a relatively small number of tasks. Efficient assignment algorithms were developed for different classes of task structures. For a p-processor system and a series-parallel precedence graph with n constituent tasks, an O(np^2) algorithm is provided that finds the optimal assignment for the response time optimization problem; the assignment optimizing throughput under a response time constraint can be found in O(np^2 log p) time. Special cases of linear, independent, and tree graphs are also considered.
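The series-parallel O(np^2) algorithm itself is not reproduced here, but the flavor of the problem can be shown for the simplest case of a linear pipeline: a small dynamic program that allocates at most P processors across the stages to minimize total response time while every stage meets a throughput bound. The timing table and bounds below are made-up values.

```python
def assign_processors(exec_time, P, max_stage_time):
    """Hedged sketch (not the paper's algorithm): processor assignment for a
    purely linear pipeline.  exec_time[i][p-1] is the measured response time
    of task i on p processors.  Minimizes the sum of stage times subject to
    every stage time <= max_stage_time (the throughput requirement).
    Returns (total_response_time, processors_per_task) or None if infeasible."""
    n, INF = len(exec_time), float("inf")
    # best[i][q]: minimal total time of tasks 0..i-1 using exactly q processors
    best = [[INF] * (P + 1) for _ in range(n + 1)]
    pick = [[0] * (P + 1) for _ in range(n + 1)]
    best[0][0] = 0.0

    for i in range(1, n + 1):
        for q in range(1, P + 1):
            for p in range(1, q + 1):
                t = exec_time[i - 1][p - 1]
                if t <= max_stage_time and best[i - 1][q - p] + t < best[i][q]:
                    best[i][q] = best[i - 1][q - p] + t
                    pick[i][q] = p

    q = min(range(P + 1), key=lambda k: best[n][k])
    if best[n][q] == INF:
        return None
    total, alloc = best[n][q], []
    for i in range(n, 0, -1):           # backtrack the chosen allocation
        alloc.append(pick[i][q])
        q -= pick[i][q]
    return total, alloc[::-1]

# Toy example: 3 tasks, at most 4 processors, stage time must stay below 7.
times = [[9, 5, 3, 2],    # task 0 on 1..4 processors
         [6, 4, 3, 2],    # task 1
         [7, 4, 3, 2]]    # task 2
print(assign_processors(times, 4, 7))
```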
Wavefront optimized nonlinear microscopy of ex vivo human retinas
NASA Astrophysics Data System (ADS)
Gualda, Emilio J.; Bueno, Juan M.; Artal, Pablo
2010-03-01
A multiphoton microscope incorporating a Hartmann-Shack (HS) wavefront sensor to control the ultrafast laser beam's wavefront aberrations has been developed. This instrument allowed us to investigate the impact of the laser beam aberrations on two-photon autofluorescence imaging of human retinal tissues. We demonstrated that nonlinear microscopy images are improved when laser beam aberrations are minimized by realigning the laser system cavity while wavefront controlling. Nonlinear signals from several human retinal anatomical features have been detected for the first time, without the need of fixation or staining procedures. Beyond the improved image quality, this approach reduces the required excitation power levels, minimizing the side effects of phototoxicity within the imaged sample. In particular, this may be important to study the physiology and function of the healthy and diseased retina.
Morselli, Paolo Giovanni; Micai, Alessandro; Giorgini, Federico Armando
2016-08-01
The "Lull pgm system" is a closed system for purifying harvested fat. It processes the collected tissue safely without any additional cost. The system was conceived by referring to the targets described in the literature with the aim of creating a simple system that guarantees a high standard of purification and requires minimal equipment that is available in every operating room. Cost must be always considered: even the most prosperous hospitals must keep within tight annual budgets. "Lull" can be used instead of expensive devices or disposable kits, without substantially increasing the operating time. The system has been used in clinical practice for many plastic reconstructive procedures and has obtained positive results and patient satisfaction, and no contraindications or disadvantages have been observed.
Nallasivam, Ulaganathan; Shah, Vishesh H.; Shenvi, Anirudh A.; ...
2016-02-10
We present a general Global Minimization Algorithm (GMA) to identify basic or thermally coupled distillation configurations that require the least vapor duty under minimum reflux conditions for separating any ideal or near-ideal multicomponent mixture into a desired number of product streams. In this algorithm, global optimality is guaranteed by modeling the system using Underwood equations and reformulating the resulting constraints to bilinear inequalities. The speed of convergence to the globally optimal solution is increased by using appropriate feasibility- and optimality-based variable-range reduction techniques and by developing valid inequalities. As a result, the GMA can be coupled with already developed techniques that enumerate basic and thermally coupled distillation configurations, to provide, for the first time, a global optimization-based rank list of distillation configurations.
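For readers unfamiliar with the Underwood equations mentioned above, the sketch below computes the minimum vapor duty of a single sharp split; it is a textbook calculation under simplifying assumptions (constant relative volatility, one active root), not the paper's bilinear GMA formulation.

```python
from scipy.optimize import brentq

def underwood_vmin(alpha, z, d, q=1.0):
    """Minimum vapor duty of a single column for a sharp light-key/heavy-key
    split, from Underwood's equations.

    alpha : relative volatilities, sorted in descending order
    z     : feed mole fractions
    d     : component molar flows in the distillate (same basis as the feed)
    q     : feed thermal quality (1.0 = saturated-liquid feed)
    """
    def feed_eq(theta):
        return sum(a * zi / (a - theta) for a, zi in zip(alpha, z)) - (1.0 - q)

    # The active Underwood root lies between the volatilities of the two keys.
    eps = 1e-6
    theta = brentq(feed_eq, alpha[1] + eps, alpha[0] - eps)
    vmin = sum(a * di / (a - theta) for a, di in zip(alpha, d))
    return theta, vmin

# Example: 1 mol of an equimolar ternary feed, all of the lightest
# component (and nothing else) taken overhead.
print(underwood_vmin(alpha=[4.0, 2.0, 1.0], z=[1/3, 1/3, 1/3], d=[1/3, 0.0, 0.0]))
```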
43 CFR 3272.12 - What environmental protection measures must I include in my utilization plan?
Code of Federal Regulations, 2014 CFR
2014-10-01
... resources; (6) Minimize air and noise pollution; and (7) Minimize hazards to public health and safety during... operations to ensure that they comply with the requirements of § 3200.4, and applicable noise, air, and water... may require you to collect data concerning existing air and water quality, noise, seismicity...
43 CFR 3272.12 - What environmental protection measures must I include in my utilization plan?
Code of Federal Regulations, 2013 CFR
2013-10-01
... resources; (6) Minimize air and noise pollution; and (7) Minimize hazards to public health and safety during... operations to ensure that they comply with the requirements of § 3200.4, and applicable noise, air, and water... may require you to collect data concerning existing air and water quality, noise, seismicity...
43 CFR 3272.12 - What environmental protection measures must I include in my utilization plan?
Code of Federal Regulations, 2012 CFR
2012-10-01
... resources; (6) Minimize air and noise pollution; and (7) Minimize hazards to public health and safety during... operations to ensure that they comply with the requirements of § 3200.4, and applicable noise, air, and water... may require you to collect data concerning existing air and water quality, noise, seismicity...
Minimally invasive video-assisted thyroidectomy: Ascending the learning curve
Capponi, Michela Giulii; Bellotti, Carlo; Lotti, Marco; Ansaloni, Luca
2015-01-01
BACKGROUND: Minimally invasive video-assisted thyroidectomy (MIVAT) is a technically demanding procedure and requires a surgical team skilled in both endocrine and endoscopic surgery. The aim of this report is to point out some aspects of the learning curve of the video-assisted thyroid surgery, through the analysis of our preliminary series of procedures. PATIENTS AND METHODS: Over a period of 8 months, we selected 36 patients for minimally invasive video-assisted surgery of the thyroid. The patients were considered eligible if they presented with a nodule not exceeding 35 mm and total thyroid volume <20 ml; presence of biochemical and ultrasound signs of thyroiditis and pre-operative diagnosis of cancer were exclusion criteria. We analysed surgical results, conversion rate, operating time, post-operative complications, hospital stay and cosmetic outcomes of the series. RESULTS: We performed 36 total thyroidectomies, and in one case we also performed a concurrent parathyroidectomy. The procedure was successfully carried out in 33 out of 36 cases (conversion rate 8.3%). The mean operating time was 109 min (range: 80-241 min) and reached a plateau after 29 MIVAT. Post-operative complications included three transient recurrent nerve palsies and two transient hypocalcemias; no definitive hypoparathyroidism was registered. The cosmetic result was considered excellent by most patients. CONCLUSIONS: Advances in skills and technology allow surgeons to easily reproduce the standard open total thyroidectomy with video-assistance. Although the learning curve represents a time-consuming step, training remains a crucial point in gaining a reasonable confidence with video-assisted surgical technique. PMID:25883451
Radiation equivalent dose simulations for long-term interplanetary flights
NASA Astrophysics Data System (ADS)
Dobynde, M. I.; Drozdov, A.; Shprits, Y. Y.
2016-12-01
Cosmic particle radiation is a limiting factor for human interplanetary flights. Unmanned flights inside the heliosphere and human flights inside the magnetosphere are becoming routine, whereas there have been only a few short-term human flights beyond it (the Apollo missions, 1969-1972), with a maximum duration of less than a month. Long-term human flights place much higher requirements on radiation shielding, primarily because of the long exposure to cosmic radiation. Inside the heliosphere there are two main sources of cosmic radiation: galactic cosmic rays (GCR) and solar particle events (SPE). GCR come from outside the heliosphere, forming a background of radiation that affects the spacecraft. The intensity of GCR varies with solar activity, increasing as solar activity decreases and vice versa, with a modulation time (time between nearest maxima) of 11 years. SPE are short-term events compared to the GCR modulation time, but their particle fluxes are much higher. The probability of SPE increases with increasing solar activity. The time dependence of the intensity of these two components encourages looking for a flight time window in which the intensity and effect of GCR and SPE are minimized. Combining GEANT4 Monte Carlo simulations with a time-dependent model of GCR spectra and data on SPE spectra, we show the time dependence of the radiation dose in an anthropomorphic human phantom inside the shielding capsule. Different types of particles affect the human body differently, causing more or less harm to the tissues. We use quality factors to convert absorbed dose into biological equivalent dose, which gives more information about the risks to astronauts' health. Incident particles produce a large number of secondary particles while propagating through the shielding capsule. We try to find an optimal combination of shielding material and thickness that effectively decreases the incident particle energy while minimizing the flux of induced secondary particles and of the most harmful particle types.
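As a small illustration of the quality-factor step described above, the snippet below weights absorbed doses by particle type to obtain an equivalent dose; the weighting factors are generic ICRP-style values chosen for illustration, not the values used in the study.

```python
# Radiation weighting (quality) factors, illustrative ICRP-style values only;
# the neutron factor in particular is energy dependent in practice.
quality_factor = {"gamma": 1.0, "proton": 2.0, "neutron": 10.0, "alpha": 20.0}

def equivalent_dose(absorbed_dose_by_type):
    """absorbed_dose_by_type: {particle: absorbed dose in Gy} -> dose in Sv."""
    return sum(quality_factor[p] * d for p, d in absorbed_dose_by_type.items())

# Example: doses (Gy) accumulated by particle type in a phantom organ.
print(equivalent_dose({"proton": 0.02, "alpha": 0.001, "gamma": 0.005}))
```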
[Focus Notified Bodies. New requirements for designation and monitoring].
Poos, U; Edelhäuser, R
2014-12-01
For medical devices with a higher risk, Notified Bodies assess whether the manufacturers and their products fulfill the requirements laid down in the European directives on medical devices. Notified Bodies are designated through a designation procedure by the designating authority, in Germany by ZLG. The requirements for the designation arise from the respective annexes of the directives on medical devices. Since these are only minimal criteria, different documents have been compiled on a European and national level to concretize these minimal criteria regarding the organization, quality management system, resources, and certification procedure. The rules of the ZLG are thereby the essential documents for designation in Germany. Moreover, according to Implementing Regulation (EU) no. 912/2013, the European commission and the other European designating authorities also have to be involved in the designation process. The aim of continuous monitoring of the Notified Bodies with assessments on the bodies' premises as well as with observed audits is to ensure the permanent fulfillment of the requirements. If nonconformities are found in a body's quality management system or in its implementation of the conformity assessment procedures, the body is obliged to provide ZLG with a corrective actions plan. In the case that the nonconformities are not resolved in time or critical nonconformities are found, ZLG may take actions, e.g., restrict the scope of designation, suspend, or - as last resort - withdraw the designation.
Piezo-Operated Shutter Mechanism Moves 1.5 cm
NASA Technical Reports Server (NTRS)
Glaser, Robert; Bamford, Robert
2005-01-01
The figure shows parts of a shutter mechanism designed to satisfy a number of requirements specific to its original intended application as a component of an atomic clock to be flown in outer space. The mechanism may also be suitable for use in laboratory and industrial vacuum systems on Earth for which there are similar requirements. The requirements include the following: a) To alternately close, then open, a 1.5-cm-diameter optical aperture twice per second, with a stroke time of no more than 15 ms, during a total operational lifetime of at least a year; b) To attenuate light by a factor of at least 10^12 when in the closed position; c) To generate little or no magnetic field; d) To be capable of withstanding bakeout at a temperature of 200 C to minimize outgassing during subsequent operation in an ultrahigh vacuum; and e) To fit within a diameter of 12 in. (=305 mm), a size limit dictated by the size of an associated magnetic shield. The light-attenuation requirement is satisfied by use of overlapping shutter blades. The closure of the aperture involves, among other things, insertion of a single shutter blade between a pair of shutter blades. The requirement to minimize the magnetic field is satisfied by use of piezoelectric actuators. Because piezoelectric actuators cannot withstand bakeout, they must be mounted outside the vacuum chamber, and, hence, motion must be transmitted from the actuators to the shutter levers via a vacuum-chamber-wall diaphragm.
TDRSS telecommunications system, PN code analysis
NASA Technical Reports Server (NTRS)
Dixon, R.; Gold, R.; Kaiser, F.
1976-01-01
The pseudo noise (PN) codes required to support the TDRSS telecommunications services are analyzed and the impact of alternate coding techniques on the user transponder equipment, the TDRSS equipment, and all factors that contribute to the acquisition and performance of these telecommunication services is assessed. Possible alternatives to the currently proposed hybrid FH/direct sequence acquisition procedures are considered and compared relative to acquisition time, implementation complexity, operational reliability, and cost. The hybrid FH/direct sequence technique is analyzed and rejected in favor of a recommended approach which minimizes acquisition time and user transponder complexity while maximizing probability of acquisition and overall link reliability.
Upwind relaxation methods for the Navier-Stokes equations using inner iterations
NASA Technical Reports Server (NTRS)
Taylor, Arthur C., III; Ng, Wing-Fai; Walters, Robert W.
1992-01-01
A subsonic and a supersonic problem are respectively treated by an upwind line-relaxation algorithm for the Navier-Stokes equations using inner iterations to accelerate steady-state solution convergence and thereby minimize CPU time. While the ability of the inner iterative procedure to mimic the quadratic convergence of the direct solver method is attested to in both test problems, some of the nonquadratic inner iterative results are noted to have been more efficient than the quadratic. In the more successful, supersonic test case, inner iteration required only about 65 percent of the line-relaxation method-entailed CPU time.
Virtual Ultrasound Guidance for Inexperienced Operators
NASA Technical Reports Server (NTRS)
Caine, Timothy; Martin, Davis
2012-01-01
Medical ultrasound or echocardiographic studies are highly operator-dependent and generally require lengthy training and internship to perfect. To obtain quality echocardiographic images in remote environments, such as on-orbit, remote guidance of studies has been employed. This technique involves minimal training for the user, coupled with remote guidance from an expert. When real-time communication or expert guidance is not available, a more autonomous system of guiding an inexperienced operator through an ultrasound study is needed. One example would be missions beyond low Earth orbit, in which the time delay inherent with communication will make remote guidance impractical.
Okamura, Naomi; Kobayashi, Yo; Sugano, Shigeki; Fujie, Masakatsu G
2017-07-01
Static stretching is widely performed to decrease muscle tone as a part of rehabilitation protocols. Finding out the optimal duration of static stretching is important to minimize the time required for rehabilitation therapy and it would be helpful for maintaining the patient's motivation towards daily rehabilitation tasks. Several studies have been conducted for the evaluation of static stretching; however, the recommended duration of static stretching varies widely between 15-30 s in general, because the traditional methods for the assessment of muscle tone do not monitor the continuous change in the target muscle's state. We have developed a method to monitor the viscoelasticity of one muscle continuously during static stretching, using a wearable indentation tester. In this study, we investigated a suitable signal processing method to detect the time required to change the muscle tone, utilizing the data collected using a wearable indentation tester. By calculating a viscoelastic index with a certain time window, we confirmed that the stretching duration required to bring about a decrease in muscle tone could be obtained with an accuracy in the order of 1 s.
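A plausible form of the windowed analysis is sketched below: the continuously measured index is smoothed with a moving window, and the first time it drops by a fixed fraction from its starting level is reported. The window length, drop threshold, and synthetic relaxation curve are assumptions for illustration, not the authors' exact signal processing.

```python
import numpy as np

def time_to_tone_change(t, index, window_s=1.0, drop=0.10):
    """Smooth a continuously measured viscoelastic index with a moving window
    and return the first time it has fallen by `drop` (10 %) from its initial
    level.  t: uniformly spaced sample times (s); index: measured samples."""
    dt = t[1] - t[0]
    n = max(1, int(round(window_s / dt)))
    kernel = np.ones(n) / n
    smoothed = np.convolve(index, kernel, mode="valid")
    t_smoothed = t[n - 1:]
    baseline = smoothed[0]
    below = np.nonzero(smoothed <= (1.0 - drop) * baseline)[0]
    return t_smoothed[below[0]] if below.size else None

# Synthetic example: an exponentially relaxing index sampled at 100 Hz.
t = np.arange(0, 30, 0.01)
index = 1.0 + 0.3 * np.exp(-t / 8.0) + 0.01 * np.random.randn(t.size)
print(time_to_tone_change(t, index))
```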
Cognitive performance in women with fibromyalgia: A case-control study.
Pérez de Heredia-Torres, Marta; Huertas-Hoyas, Elisabet; Máximo-Bocanegra, Nuria; Palacios-Ceña, Domingo; Fernández-De-Las-Peñas, César
2016-10-01
This study aimed to evaluate the differences in cognitive skills between women with fibromyalgia and healthy women, and the correlations between functional independence and cognitive limitations. A cross-sectional study was performed. Twenty women with fibromyalgia and 20 matched controls participated. Outcomes included the Numerical Pain Rating Scale, the Functional Independence Measure, the Fibromyalgia Impact Questionnaire and Gradior © software. The Student's t-test and the Spearman's rho test were applied to the data. Women affected required a greater mean time (P < 0.020) and maximum time (P < 0.015) during the attention test than the healthy controls. In the memory test they displayed greater execution errors (P < 0.001), minimal time (P < 0.001) and mean time (P < 0.001) whereas, in the perception tests, they displayed a greater mean time (P < 0.009) and maximum time (P < 0.048). Correlations were found between the domains of the functional independence measure and the cognitive abilities assessed. Women with fibromyalgia exhibited a decreased cognitive ability compared to healthy controls, which negatively affected the performance of daily activities, such as upper limb dressing, feeding and personal hygiene. Patients required more time to perform activities requiring both attention and perception, decreasing their functional independence. Also, they displayed greater errors when performing activities requiring the use of memory. Occupational therapists treating women with fibromyalgia should consider the negative impact of possible cognitive deficits on the performance of daily activities and offer targeted support strategies. © 2016 Occupational Therapy Australia.
A Framework for Robust Multivariable Optimization of Integrated Circuits in Space Applications
NASA Technical Reports Server (NTRS)
DuMonthier, Jeffrey; Suarez, George
2013-01-01
Application Specific Integrated Circuit (ASIC) design for space applications involves multiple challenges of maximizing performance, minimizing power and ensuring reliable operation in extreme environments. This is a complex multidimensional optimization problem which must be solved early in the development cycle of a system due to the time required for testing and qualification severely limiting opportunities to modify and iterate. Manual design techniques which generally involve simulation at one or a small number of corners with a very limited set of simultaneously variable parameters in order to make the problem tractable are inefficient and not guaranteed to achieve the best possible results within the performance envelope defined by the process and environmental requirements. What is required is a means to automate design parameter variation, allow the designer to specify operational constraints and performance goals, and to analyze the results in a way which facilitates identifying the tradeoffs defining the performance envelope over the full set of process and environmental corner cases. The system developed by the Mixed Signal ASIC Group (MSAG) at the Goddard Space Flight Center is implemented as framework of software modules, templates and function libraries. It integrates CAD tools and a mathematical computing environment, and can be customized for new circuit designs with only a modest amount of effort as most common tasks are already encapsulated. Customization is required for simulation test benches to determine performance metrics and for cost function computation. Templates provide a starting point for both while toolbox functions minimize the code required. Once a test bench has been coded to optimize a particular circuit, it is also used to verify the final design. The combination of test bench and cost function can then serve as a template for similar circuits or be re-used to migrate the design to different processes by re-running it with the new process specific device models. The system has been used in the design of time to digital converters for laser ranging and time-of-flight mass spectrometry to optimize analog, mixed signal and digital circuits such as charge sensitive amplifiers, comparators, delay elements, radiation tolerant dual interlocked (DICE) flip-flops and two of three voter gates.
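The framework itself is not reproduced here, but the core idea of sweeping design parameters and scoring the worst-case corner can be sketched as follows; the metric model, corner list, weights, and delay specification are placeholders, not the MSAG framework's API or any real simulator interface.

```python
from itertools import product

# Process/temperature corners and candidate design parameters (all made up).
corners = [("slow", -55), ("slow", 125), ("fast", -55), ("fast", 125)]
bias_ua = [5, 10, 20, 40]           # candidate bias currents (uA)
cap_ff  = [50, 100, 200]            # candidate load capacitances (fF)

def metrics(bias, cap, process, temp_c):
    """Stand-in for a simulator run: returns (delay_ns, power_uw)."""
    speed = 1.2 if process == "fast" else 0.8
    derate = 1.0 + 0.002 * (temp_c - 25)
    delay = derate * cap / (speed * bias)
    power = bias * 1.8 * derate
    return delay, power

def cost(delay, power, delay_spec=4.0):
    # Hard-penalize spec violations, otherwise trade delay against power.
    penalty = 1e6 if delay > delay_spec else 0.0
    return penalty + delay + 0.1 * power

# Pick the parameter set whose *worst* corner cost is smallest (min-max).
best = min(
    ((b, c) for b, c in product(bias_ua, cap_ff)),
    key=lambda p: max(cost(*metrics(p[0], p[1], proc, t)) for proc, t in corners),
)
print("best (bias uA, cap fF):", best)
```

A real flow would replace `metrics` with calls into the CAD simulation test bench and would report the full trade-off surface rather than a single winner, but the min-max-over-corners structure is the same.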
Dhir, Ashish; Rogawski, Michael A
2018-05-01
Diazepam, administered by the intravenous, oral, or rectal routes, is widely used for the management of acute seizures. Dosage forms for delivery of diazepam by other routes of administration, including intranasal, intramuscular, and transbuccal, are under investigation. In predicting what dosages are necessary to terminate seizures, the minimal exposure required to confer seizure protection must be known. Here we administered diazepam by continuous intravenous infusion to obtain near-steady-state levels, which allowed an assessment of the minimal levels that elevate seizure threshold. The thresholds for various behavioral seizure signs (myoclonic jerk, clonus, and tonus) were determined with the timed intravenous pentylenetetrazol seizure threshold test in rats. Diazepam was administered to freely moving animals by continuous intravenous infusion via an indwelling jugular vein cannula. Blood samples for assay of plasma levels of diazepam and metabolites were recovered via an indwelling cannula in the contralateral jugular vein. The pharmacokinetic parameters of diazepam following a single 80-μg/kg intravenous bolus injection were determined using a noncompartmental pharmacokinetic approach. The derived parameters Vd, CL, t1/2α (distribution half-life), and t1/2β (terminal half-life) for diazepam were 608 mL, 22.1 mL/min, 13.7 minutes, and 76.8 minutes, respectively. Various doses of diazepam were continuously infused without or with an initial loading dose. At the end of the infusions, the thresholds for various behavioral seizure signs were determined. The minimal plasma diazepam concentration associated with threshold elevations was estimated at approximately 70 ng/mL. The active metabolites nordiazepam, oxazepam, and temazepam achieved levels that are expected to make only minor contributions to the threshold elevations. Diazepam elevates seizure threshold at steady-state plasma concentrations lower than previously recognized. The minimally effective plasma concentration provides a reference that may be considered when estimating the diazepam exposure required for acute seizure treatment. Wiley Periodicals, Inc. © 2018 International League Against Epilepsy.
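Using only the reported parameters and the standard steady-state relations, a back-of-the-envelope check of the infusion needed to hold the roughly 70 ng/mL threshold concentration might look as follows; this is a rough sketch for the rat model, not a dosing recommendation.

```python
# Steady-state infusion rate = CL * Css; approximate loading dose = Vd * Css.
CL_ml_per_min = 22.1      # clearance reported above
Vd_ml = 608.0             # volume of distribution reported above
Css_ng_per_ml = 70.0      # threshold-elevating steady-state concentration

infusion_rate_ug_per_min = CL_ml_per_min * Css_ng_per_ml / 1000.0
loading_dose_ug = Vd_ml * Css_ng_per_ml / 1000.0
print(f"infusion rate ~{infusion_rate_ug_per_min:.2f} ug/min, "
      f"loading dose ~{loading_dose_ug:.1f} ug")
```

The numbers work out to roughly 1.5 ug/min and 43 ug for this set of parameters, consistent with the idea that quite low exposures suffice to raise the threshold.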
An autonomous payload controller for the Space Shuttle
NASA Technical Reports Server (NTRS)
Hudgins, J. I.
1979-01-01
The Autonomous Payload Control (APC) system discussed in the present paper was designed on the basis of such criteria as minimal cost of implementation, minimal space required in the flight-deck area, simple operation with verification of the results, minimal additional weight, minimal impact on Orbiter design, and minimal impact on Orbiter payload integration. In its present configuration, the APC provides a means for the Orbiter crew to control as many as 31 autonomous payloads. The avionics and human engineering aspects of the system are discussed.
Evolutionary Bi-objective Optimization for Bulldozer and Its Blade in Soil Cutting
NASA Astrophysics Data System (ADS)
Sharma, Deepak; Barakat, Nada
2018-02-01
An evolutionary optimization approach is adopted in this paper for simultaneously achieving economic and productive soil cutting. The economic aspect is defined by minimizing the power requirement from the bulldozer, and the soil cutting is made productive by minimizing the time of soil cutting. For determining the power requirement, two force models are adopted from the literature to quantify the cutting force on the blade. Three domain-specific constraints are also proposed, which are limiting the power from the bulldozer, limiting the maximum force on the bulldozer blade and achieving the desired production rate. The bi-objective optimization problem is solved using five benchmark multi-objective evolutionary algorithms and one classical optimization technique, the ɛ-constraint method. The Pareto-optimal solutions are obtained, including the knee region. Further, the post-optimal analysis is performed on the obtained solutions to decipher relationships among the objectives and decision variables. Such relationships are later used for making guidelines for selecting the optimal set of input parameters. The obtained results are then compared with the experimental results from the literature, which show a close agreement among them.
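The ɛ-constraint half of the approach can be illustrated with a toy model: minimize a stand-in power objective subject to a cutting-time budget that is swept to trace a Pareto front. The force and time expressions, bounds, and budget range below are placeholders, not the force models adopted in the paper.

```python
import numpy as np
from scipy.optimize import minimize

def power_req(x):                     # objective 1: power demand (toy model)
    depth, speed = x
    return (50.0 + 400.0 * depth) * speed          # force * speed

def cutting_time(x):                  # objective 2: time to cut a fixed volume
    depth, speed = x
    return 10.0 / (depth * speed)

bounds = [(0.05, 0.4), (0.2, 2.0)]    # blade depth (m), forward speed (m/s)
pareto = []
for eps in np.linspace(15.0, 100.0, 8):            # time budget being swept
    res = minimize(power_req, x0=[0.2, 1.0], bounds=bounds,
                   constraints=[{"type": "ineq",
                                 "fun": lambda x, e=eps: e - cutting_time(x)}])
    if res.success:
        pareto.append((cutting_time(res.x), power_req(res.x)))
print(pareto)
```

Each solve contributes one (time, power) point; sweeping the budget eps traces out the trade-off curve on which a knee region can then be identified.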
NASA Technical Reports Server (NTRS)
Williams, J. L.; Copeland, R. J.; Nebbon, B. W.
1972-01-01
The most promising closed CO2 control concept identified by this study is the solid-pellet Mg(OH)2 system. Two promising approaches to closed thermal control were identified. The AHS system uses modular fusible heat sinks, with a contingency evaporative mode, to allow maximum EVA mobility. The AHS/refrigerator top-off subsystem requires an umbilical to minimize expendables, but less EVA time is used to operate the system, since there is no requirement to change modules. Both of these subsystems are thought to be practical solutions to the problem of providing closed heat rejection for an EVA system.
Detailed requirements document for common software of shuttle program information management system
NASA Technical Reports Server (NTRS)
Everette, J. M.; Bradfield, L. D.; Horton, C. L.
1975-01-01
Common software was investigated as a method for minimizing development and maintenance cost of the shuttle program information management system (SPIMS) applications while reducing the time-frame of their development. Those requirements satisfying these criteria are presented along with the stand-alone modules which may be used directly by applications. The SPIMS applications operating on the CYBER 74 computer, are specialized information management systems which use System 2000 as a data base manager. Common software provides the features to support user interactions on a CRT terminal using form input and command response capabilities. These features are available as subroutines to the applications.
A human performance modelling approach to intelligent decision support systems
NASA Technical Reports Server (NTRS)
Mccoy, Michael S.; Boys, Randy M.
1987-01-01
Manned space operations require that the many automated subsystems of a space platform be controllable by a limited number of personnel. To minimize the interaction required of these operators, artificial intelligence techniques may be applied to embed a human performance model within the automated, or semi-automated, systems, thereby allowing the derivation of operator intent. A similar application has previously been proposed in the domain of fighter piloting, where the demand for pilot intent derivation is primarily a function of limited time and high workload rather than limited operators. The derivation and propagation of pilot intent is presented as it might be applied to some programs.
NASA Technical Reports Server (NTRS)
Quilligan, Gerard; DeMonthier, Jeffrey; Suarez, George
2011-01-01
This innovation addresses challenges in lidar imaging, particularly with the detection scheme and the shapes of the detected signals. Ideally, the echoed pulse widths should be extremely narrow to resolve fine detail at high event rates. However, narrow pulses require wideband detection circuitry with increased power dissipation to minimize thermal noise. Filtering is also required to shape each received signal into a form suitable for processing by a constant fraction discriminator (CFD) followed by a time-to-digital converter (TDC). As the intervals between the echoes decrease, the finite bandwidth of the shaping circuits blends the pulses into an analog signal (luminance) with multiple modes, reducing the ability of the CFD to discriminate individual events.
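For reference, a constant fraction discriminator can be emulated in a few lines: form an attenuated copy of the pulse minus a delayed copy and take the zero crossing, whose time is nearly independent of pulse amplitude. The pulse shape, delay, and fraction below are illustrative assumptions, not the flight design.

```python
import numpy as np

def cfd_timing(t, v, delay_s=2e-9, fraction=0.3, arm=0.05):
    """Constant-fraction discrimination sketch: the zero crossing of
    fraction*v(t) - v(t - delay) gives an amplitude-independent time stamp."""
    dt = t[1] - t[0]
    d = int(round(delay_s / dt))
    delayed = np.concatenate([np.zeros(d), v[:-d]])
    shaped = fraction * v - delayed
    armed = np.nonzero(v > arm * v.max())[0][0]      # ignore baseline noise
    for i in range(armed, len(shaped) - 1):
        if shaped[i] >= 0.0 > shaped[i + 1]:
            frac = shaped[i] / (shaped[i] - shaped[i + 1])
            return t[i] + frac * dt                   # interpolated crossing
    return None

# Two Gaussian echoes of different amplitude give (almost) the same timing.
t = np.arange(0, 50e-9, 0.1e-9)
for amp in (0.2, 1.0):
    pulse = amp * np.exp(-((t - 20e-9) / 3e-9) ** 2)
    print(cfd_timing(t, pulse))
```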
LST and instrument considerations. [modular design
NASA Technical Reports Server (NTRS)
Levin, G. M.
1974-01-01
In order that the LST meet its scientific objectives and also be a National Astronomical Space Facility during the 1980's and 1990's, broad requirements have been levied by the scientific community. These scientific requirements can be directly translated into design requirements and specifications for the scientific instruments. The instrument ensemble design must be consistent with a 15-year operational lifetime. Downtime for major repair/refurbishment or instrument updating must be minimized. The overall efficiency and performance of the instruments should be maximized. Modularization of instruments and instrument subsystems, some degree of on-orbit servicing (both repair and replacement), on-axis location, minimizing the number of reflections within instruments, minimizing polarization effects, and simultaneous operation of the F/24 camera with other instruments, are just a few of the design guidelines and specifications which can and will be met in order that these broader scientific requirements be satisfied.
Automated Testing Experience of the Linear Aerospike SR-71 Experiment (LASRE) Controller
NASA Technical Reports Server (NTRS)
Larson, Richard R.
1999-01-01
System controllers must be fail-safe, low cost, flexible to software changes, able to output health and status words, and permit rapid retest qualification. The system controller designed and tested for the aerospike engine program was an attempt to meet these requirements. This paper describes (1) the aerospike controller design, (2) the automated simulation testing techniques, and (3) the real time monitoring data visualization structure. Controller cost was minimized by design of a single-string system that used an off-the-shelf 486 central processing unit (CPU). A linked-list architecture, with states (nodes) defined in a user-friendly state table, accomplished software changes to the controller. Proven to be fail-safe, this system reported the abort cause and automatically reverted to a safe condition for any first failure. A real time simulation and test system automated the software checkout and retest requirements. A program requirement to decode all abort causes in real time during all ground and flight tests assured the safety of flight decisions and the proper execution of mission rules. The design also included health and status words, and provided a real time analysis interpretation for all health and status data.
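The linked-list/state-table idea can be sketched as a table-driven state machine in which every state names its health check, its successor, and a safe abort state, with the abort cause recorded in a status word. The state names, checks, and sensor values below are hypothetical, not the LASRE software.

```python
# Hypothetical state table: each node carries a check, the next node on
# success, and the safe state to fall back to on any first failure.
STATE_TABLE = {
    "PURGE":     {"check": lambda s: s["n2_pressure"] > 300, "next": "PRECHILL",  "abort": "SAFE"},
    "PRECHILL":  {"check": lambda s: s["lox_temp"] < 95,     "next": "IGNITE",    "abort": "SAFE"},
    "IGNITE":    {"check": lambda s: s["chamber_p"] > 150,   "next": "MAINSTAGE", "abort": "SAFE"},
    "MAINSTAGE": {"check": lambda s: s["chamber_p"] > 150,   "next": "MAINSTAGE", "abort": "SAFE"},
    "SAFE":      {"check": lambda s: True,                   "next": "SAFE",      "abort": "SAFE"},
}

def step(state, sensors, status):
    """Advance one control cycle; on the first failed check, record the
    abort cause in the health/status word and drop to the safe state."""
    entry = STATE_TABLE[state]
    if entry["check"](sensors):
        return entry["next"]
    status["abort_cause"] = f"{state}: check failed with {sensors}"
    return entry["abort"]

status, state = {}, "PURGE"
for sensors in [{"n2_pressure": 350, "lox_temp": 90, "chamber_p": 20},
                {"n2_pressure": 350, "lox_temp": 120, "chamber_p": 20}]:
    state = step(state, sensors, status)
print(state, status)
```

Because the table is plain data, software changes amount to editing rows rather than control logic, which is the property the abstract attributes to the user-friendly state table.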
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jones, Christian Birk; Robinson, Matt; Yasaei, Yasser
Optimal integration of thermal energy storage within commercial building applications requires accurate load predictions. Several methods exist that provide an estimate of a building's future needs. Methods include component-based models and data-driven algorithms. This work implemented a previously untested algorithm for this application that is called a Laterally Primed Adaptive Resonance Theory (LAPART) artificial neural network (ANN). The LAPART algorithm provided accurate results over a two month period where minimal historical data and a small amount of input types were available. These results are significant, because common practice has often overlooked the implementation of an ANN. ANNs have often been perceived to be too complex and require large amounts of data to provide accurate results. The LAPART neural network was implemented in an on-line learning manner. On-line learning refers to the continuous updating of training data as time occurs. For this experiment, training began with a single day and grew to two months of data. This approach provides a platform for immediate implementation that requires minimal time and effort. The results from the LAPART algorithm were compared with statistical regression and a component-based model. The comparison was based on the predictions' linear relationship with the measured data, mean squared error, mean bias error, and cost savings achieved by the respective prediction techniques. The results show that the LAPART algorithm provided a reliable and cost effective means to predict the building load for the next day.
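A minimal sketch of the on-line learning loop, with an ordinary regressor standing in for LAPART (which is not implemented here), shows the day-by-day retraining and the MSE and mean-bias-error metrics mentioned above; all data below are synthetic.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
days = 60
hour = np.tile(np.arange(24), days)
outdoor_t = 15 + 10 * np.sin(2 * np.pi * hour / 24) + rng.normal(0, 1, hour.size)
occupied = ((hour >= 8) & (hour <= 18)).astype(float)
load = 200 + 8 * outdoor_t + 30 * occupied + rng.normal(0, 5, hour.size)
X = np.column_stack([hour, outdoor_t, occupied])

errors = []
for day in range(1, days):                 # training starts with a single day
    seen = slice(0, 24 * day)              # everything observed so far
    nxt = slice(24 * day, 24 * (day + 1))  # next-day prediction target
    model = LinearRegression().fit(X[seen], load[seen])
    errors.append(model.predict(X[nxt]) - load[nxt])

errors = np.concatenate(errors)
print("MSE:", np.mean(errors**2), " mean bias error:", np.mean(errors))
```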
Powered Descent Guidance with General Thrust-Pointing Constraints
NASA Technical Reports Server (NTRS)
Carson, John M., III; Acikmese, Behcet; Blackmore, Lars
2013-01-01
The Powered Descent Guidance (PDG) algorithm and software for generating Mars pinpoint or precision landing guidance profiles has been enhanced to incorporate thrust-pointing constraints. Pointing constraints would typically be needed for onboard sensor and navigation systems that have specific field-of-view requirements to generate valid ground proximity and terrain-relative state measurements. The original PDG algorithm was designed to enforce both control and state constraints, including maximum and minimum thrust bounds, avoidance of the ground or descent within a glide slope cone, and maximum speed limits. The thrust-bound and thrust-pointing constraints within PDG are non-convex, which in general requires nonlinear optimization methods to generate solutions. The short duration of Mars powered descent requires guaranteed PDG convergence to a solution within a finite time; however, nonlinear optimization methods have no guarantees of convergence to the global optimal or convergence within finite computation time. A lossless convexification developed for the original PDG algorithm relaxed the non-convex thrust bound constraints. This relaxation was theoretically proven to provide valid and optimal solutions for the original, non-convex problem within a convex framework. As with the thrust bound constraint, a relaxation of the thrust-pointing constraint also provides a lossless convexification that ensures the enhanced relaxed PDG algorithm remains convex and retains validity for the original nonconvex problem. The enhanced PDG algorithm provides guidance profiles for pinpoint and precision landing that minimize fuel usage, minimize landing error to the target, and ensure satisfaction of all position and control constraints, including thrust bounds and now thrust-pointing constraints.
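The relaxation idea can be seen in a toy convex program: a fixed-mass 3-DoF double integrator in which the nonconvex bound rho1 <= ||T|| <= rho2 is replaced by a slack variable Gamma, and the pointing limit becomes a linear bound on the vertical thrust component. This cvxpy sketch uses made-up numbers and omits mass depletion and the glide-slope cone, so it illustrates the convexification only, not the flight PDG algorithm.

```python
import cvxpy as cp
import numpy as np

N, dt, g = 40, 1.0, np.array([0.0, 0.0, -3.71])   # coarse grid, Mars gravity
m, rho1, rho2 = 1500.0, 4000.0, 12000.0           # mass (kg), thrust bounds (N)
theta_max = np.deg2rad(45)                        # max tilt from vertical

r = cp.Variable((3, N + 1)); v = cp.Variable((3, N + 1))
T = cp.Variable((3, N)); Gam = cp.Variable(N)     # thrust and its slack bound

cons = [r[:, 0] == np.array([500.0, 400.0, 1500.0]),
        v[:, 0] == np.array([-30.0, 10.0, -60.0]),
        r[:, N] == 0, v[:, N] == 0, r[2, :] >= 0]
for k in range(N):
    a = T[:, k] / m + g                            # constant-mass dynamics
    cons += [v[:, k + 1] == v[:, k] + dt * a,
             r[:, k + 1] == r[:, k] + dt * v[:, k] + 0.5 * dt**2 * a,
             cp.norm(T[:, k]) <= Gam[k],           # relaxed magnitude bound
             Gam[k] >= rho1, Gam[k] <= rho2,
             T[2, k] >= np.cos(theta_max) * Gam[k]]   # thrust-pointing cone

prob = cp.Problem(cp.Minimize(cp.sum(Gam) * dt), cons)   # fuel-use proxy
prob.solve()
print(prob.status, prob.value)
```

At the optimum the slack is tight (||T|| = Gamma), which is the sense in which the relaxation is lossless for the original nonconvex bounds.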
Image guidance systems for minimally invasive sinus and skull base surgery in children.
Benoit, Margo McKenna; Silvera, V Michelle; Nichollas, Richard; Jones, Dwight; McGill, Trevor; Rahbar, Reza
2009-10-01
The use of image guidance for sinonasal and skull base surgery has been well-characterized in adults but there is limited information on the use of these systems in the pediatric population, despite their widespread use. The aim of this study is to evaluate the use of image guidance systems to facilitate an endoscopic minimally invasive approach to sinonasal and skull base surgery in a pediatric population. A retrospective cohort study was performed at a tertiary pediatric hospital. Thirty-three children presented with complications of sinusitis, tumors, traumatic, or congenital lesions of the skull base and underwent endoscopic surgery using image guidance from March 2000 to April 2007. Patient variables including diagnosis, extent of disease, and complications were extracted from paper and computer charts. Additional surgical variables including set-up time, accuracy, surgeon satisfaction index and number of uses per case were also reviewed. Twenty-eight patients (85%) underwent sinonasal surgery and five (15%) underwent skull base surgery. Indications included infectious complications of acute sinusitis (N=15), neoplasms (N=12), choanal atresia (N=4), and cerebrospinal fluid leak (N=2). Thirty-one patients (94%) required only one procedure. No surgical complications were reported. Surgeon satisfaction, mean accuracy and number of uses per procedure increased over time (p<0.05). Image guidance systems are safe and effective tools that facilitate a minimally invasive approach to sinonasal and skull base surgery in children. Consistent with adult literature, usage and surgeon comfort increased with experience. The additional anatomical information obtained by image guidance systems facilitates a minimally invasive endoscopic approach for sinonasal and skull base pathologies.
Minimally invasive esthetic ridge preservation with growth-factor enhanced bone matrix.
Nevins, Marc L; Said, Sherif
2017-12-28
Extraction socket preservation procedures are critical to successful esthetic implant therapy. Conventional surgical approaches are technique sensitive and often result in alteration of the soft tissue architecture, which then requires additional corrective surgical procedures. This case series report presents the ability of flapless surgical techniques combined with a growth factor-enhanced bone matrix to provide esthetic ridge preservation at the time of extraction for compromised sockets. When considering esthetic dental implant therapy, preservation, or further enhancement of the available tissue support at the time of tooth extraction may provide an improved esthetic outcome with reduced postoperative sequelae and decreased treatment duration. Advances in minimally invasive surgical techniques combined with recombinant growth factor technology offer an alternative for bone reconstruction while maintaining the gingival architecture for enhanced esthetic outcome. The combination of freeze-dried bone allograft (FDBA) and rhPDGF-BB (platelet-derived growth factor-BB) provides a growth-factor enhanced matrix to induce bone and soft tissue healing. The use of a growth-factor enhanced matrix is an option for minimally invasive ridge preservation procedures for sites with advanced bone loss. Further studies including randomized clinical trials are needed to better understand the extent and limits of these procedures. The use of minimally invasive techniques with growth factors for esthetic ridge preservation reduces patient morbidity associated with more invasive approaches and increases the predictability for enhanced patient outcomes. By reducing the need for autogenous bone grafts the use of this technology is favorable for patient acceptance and ease of treatment process for esthetic dental implant therapy. © 2017 Wiley Periodicals, Inc.
Reineke, Lucas C; Merrick, William C
2009-12-01
Cap-independent initiation of translation is thought to promote protein synthesis on some mRNAs during times when cap-dependent initiation is down-regulated. However, the mechanism of cap-independent initiation is poorly understood. We have previously reported the secondary structure within the yeast minimal URE2 IRES element. In this study, we sought to investigate the mechanism of internal initiation in yeast by assessing the functional role of nucleotides within the minimal URE2 IRES element, and delineating the cis-sequences that modulate levels of internal initiation using a monocistronic reporter vector. Furthermore, we compared the eIF2A sensitivity of the URE2 IRES element with some of the invasive growth IRES elements using DeltaeIF2A yeast. We found that the stability of the stem-loop structure within the minimal URE2 IRES element is not a critical determinant of optimal IRES activity, and the downstream sequences that modulate URE2 IRES-mediated translation can be defined to discrete regions within the URE2 coding region. Repression of internal initiation on the URE2 minimal IRES element by eIF2A is not dependent on the stability of the secondary structure within the URE2 IRES element. Our data also indicate that eIF2A-mediated repression is not specific to the URE2 IRES element, as both the GIC1 and PAB1 IRES elements are repressed by eIF2A. These data provide valuable insights into the mRNA requirements for internal initiation in yeast, and insights into the mechanism of eIF2A-mediated suppression.
NASA Astrophysics Data System (ADS)
Rodriguez-Pretelin, A.; Nowak, W.
2017-12-01
For most groundwater protection management programs, Wellhead Protection Areas (WHPAs) have served as the primary protection measure. In their delineation, the influence of time-varying groundwater flow conditions is often underestimated because steady-state assumptions are commonly made. However, it has been demonstrated that temporal variations lead to significant changes in the required size and shape of WHPAs. Apart from natural transient groundwater drivers (e.g., changes in the regional angle of flow direction and seasonal natural groundwater recharge), anthropogenic causes such as transient pumping rates are among the most influential factors that require larger WHPAs. We hypothesize that WHPA programs that integrate adaptive and optimized pumping-injection management schemes can counter transient effects and thus reduce the additional areal demand in well protection under transient conditions. The main goal of this study is to present a novel management framework that optimizes pumping schemes dynamically, in order to minimize the impact triggered by transient conditions in WHPA delineation. For optimizing pumping schemes, we consider three objectives: 1) to minimize the risk of pumping water from outside a given WHPA, 2) to maximize the groundwater supply, and 3) to minimize the involved operating costs. We solve transient groundwater flow through an available transient groundwater and Lagrangian particle tracking model. The optimization problem is formulated as a dynamic programming problem. Two different optimization approaches are explored: I) the first approach aims for single-objective optimization under objective (1) only; II) the second approach performs multiobjective optimization under all three objectives, where compromise pumping rates are selected from the current Pareto front. Finally, we look for WHPA outlines that are as small as possible, yet allow the optimization problem to find the most suitable solutions.
A Concept for Power Cycling the Electronics of CALICE-AHCAL with the Train Structure of ILC
NASA Astrophysics Data System (ADS)
Göttlicher, Peter; The Calice-Collaboration
Particle flow algorithm calorimetry requires high granularity three-dimensional readout. The tight power requirement of 40 μW/channel is reached by enabling readout ASIC currents only during beam delivery, corresponding to a 1% duty cycle. EMI noise caused by current switching needs to be minimized by the power system, and this paper presents ideas, simulations and first measurements for minimizing disturbances. A careful design of circuits, printed circuit boards, and the grounding scheme, together with the use of floating supplies, allows current loops to be closed locally, voltages to be stabilized, and currents in the metal structures to be minimized.
Author-based journal selection system that helps authors save time in article submission.
Ozturk, Onur; Ileri, Fatih
2018-01-01
Submission to journals takes a lot of time, and format-related submission requirements vary greatly from one journal to another. Lack of time and motivation in academia reduces scientific outputs and demotivates researchers. The Author-based journal selection system (ABJSS) is a platform for pooling manuscripts conceived to minimize the time spent for manuscript submission and to increase scientific output. The system will provide two types of account: "Author" and "Journal Administrator". Each account type will have its own abilities and permissions. The ABJSS system is an ongoing project that will be designed in cooperation with IT experts and academicians and it will be presented to the scientific world as soon as it secures sufficient support.
LiFi based automated shopping assistance application in IoT
NASA Astrophysics Data System (ADS)
Akter, Sharmin; Funke Olanrewaju, Rashidah, Dr; Islam, Thouhedul; Salma
2018-05-01
Urban people try to minimize shopping time in daily life due to time constraints. From that point of view, the supermarket concept is popular because consumers can buy different items in the same place. However, customers can spend hours finding the desired items in a large supermarket. In addition, they must queue to pay at the counter, which is also time consuming. As a result, a customer may spend 2-3 hours shopping in a large superstore. This paper proposes an Internet of Things and Li-Fi based automated smartphone and web application to find items easily during shopping, which can save consumers' time as well as reduce manpower in the supermarket.
Spacelab Mission Implementation Cost Assessment (SMICA)
NASA Technical Reports Server (NTRS)
Guynes, B. V.
1984-01-01
A total savings of approximately 20 percent is attainable if: (1) mission management and ground processing schedules are compressed; (2) the equipping, staffing, and operating of the Payload Operations Control Center is revised; and (3) methods of working with experiment developers are changed. The development of a new mission implementation technique, which includes mission definition, experiment development, and mission integration/operations, is examined. The Payload Operations Control Center is to relocate and utilize new computer equipment to produce cost savings. Methods of reducing costs by minimizing the Spacelab and payload processing time during pre- and post-mission operation at KSC are analyzed. The changes required to reduce costs in the analytical integration process are studied. The influence of time, requirements accountability, and risk on costs is discussed. Recommendations for cost reductions developed by the Spacelab Mission Implementation Cost Assessment study are listed.
NASA Technical Reports Server (NTRS)
Chin, Alexander W.; Herrera, Claudia Y.; Spivey, Natalie D.; Fladung, William A.; Cloutier, David
2015-01-01
The mass properties of an aerospace vehicle are required by multiple disciplines in the analysis and prediction of flight behavior. Pendulum oscillation methods have been developed and employed for almost a century as a means to measure mass properties. However, these oscillation methods are costly, time consuming, and risky. The NASA Armstrong Flight Research Center has been investigating the Dynamic Inertia Measurement, or DIM method as a possible alternative to oscillation methods. The DIM method uses ground test techniques that are already applied to aerospace vehicles when conducting modal surveys. Ground vibration tests would require minimal additional instrumentation and time to apply the DIM method. The DIM method has been validated on smaller test articles, but has not yet been fully proven on large aerospace vehicles.
NASA Astrophysics Data System (ADS)
Wood, W. T.; Runyan, T. E.; Palmsten, M.; Dale, J.; Crawford, C.
2016-12-01
Natural Gas (primarily methane) and gas hydrate accumulations require certain bio-geochemical, as well as physical conditions, some of which are poorly sampled and/or poorly understood. We exploit recent advances in the prediction of seafloor porosity and heat flux via machine learning techniques (e.g. Random forests and Bayesian networks) to predict the occurrence of gas and subsequently gas hydrate in marine sediments. The technique we use for prediction (actually guided interpolation) of key parameters in this study is K-nearest neighbors (KNN). KNN requires only minimal pre-processing of the data and predictors, and requires minimal run-time input, so the results are almost entirely data-driven. Specifically, we use new estimates of sedimentation rate and sediment type, along with recently derived compaction modeling, to estimate profiles of porosity and age. We combine the compaction with seafloor heat flux to estimate temperature with depth and geologic age, which, with estimates of organic carbon and models of methanogenesis, yields limits on the production of methane. Results include geospatial predictions of gas (and gas hydrate) accumulations, with quantitative estimates of uncertainty. The Generic Earth Modeling System (GEMS) we have developed to derive the machine learning estimates is modular and easily updated with new algorithms or data.
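A minimal sketch of the KNN guided-interpolation step, with synthetic predictors and target values standing in for the real grids, might look like this; scikit-learn's KNeighborsRegressor is used as the KNN implementation, which is an assumption rather than the authors' code.

```python
import numpy as np
from sklearn.neighbors import KNeighborsRegressor

rng = np.random.default_rng(1)
n_obs = 500
lon = rng.uniform(-60, -40, n_obs)
lat = rng.uniform(10, 30, n_obs)
water_depth = 2000 + 1500 * np.sin(np.deg2rad(lat)) + rng.normal(0, 50, n_obs)
# Synthetic sparsely sampled target (stand-in for, e.g., sedimentation rate).
sed_rate = 0.02 * water_depth**-0.5 + 0.001 * np.cos(np.deg2rad(lon)) \
           + rng.normal(0, 1e-4, n_obs)

X = np.column_stack([lon, lat, water_depth])
model = KNeighborsRegressor(n_neighbors=8, weights="distance").fit(X, sed_rate)

# Predict on a coarse grid where only the predictors are known.
gl, gt = np.meshgrid(np.linspace(-60, -40, 25), np.linspace(10, 30, 25))
gd = 2000 + 1500 * np.sin(np.deg2rad(gt))
grid = np.column_stack([gl.ravel(), gt.ravel(), gd.ravel()])
pred = model.predict(grid).reshape(gl.shape)
print(pred.shape, float(pred.min()), float(pred.max()))
```

The spread among the k neighbors at each grid node can serve as a rough, data-driven uncertainty proxy of the kind the abstract mentions.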
Aligning PEV Charging Times with Electricity Supply and Demand
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hodge, Cabell
Plug-in electric vehicles (PEVs) are a growing source of electricity consumption that could either exacerbate supply shortages or smooth electricity demand curves. Extensive research has explored how vehicle-grid integration (VGI) can be optimized by controlling PEV charging timing or providing vehicle-to-grid (V2G) services, such as storing energy in vehicle batteries and returning it to the grid at peak times. While much of this research has modeled charging, implementation in the real world requires a cost-effective solution that accounts for consumer behavior. To function across different contexts, several types of charging administrators and methods of control are necessary to minimize costs in the VGI context.
Digital program for solving the linear stochastic optimal control and estimation problem
NASA Technical Reports Server (NTRS)
Geyser, L. C.; Lehtinen, B.
1975-01-01
A computer program is described which solves the linear stochastic optimal control and estimation (LSOCE) problem by using a time-domain formulation. The LSOCE problem is defined as that of designing controls for a linear time-invariant system which is disturbed by white noise in such a way as to minimize a performance index which is quadratic in state and control variables. The LSOCE problem and solution are outlined; brief descriptions are given of the solution algorithms, and complete descriptions of each subroutine, including usage information and digital listings, are provided. A test case is included, as well as information on the IBM 7090-7094 DCS time and storage requirements.
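The report documents a specific digital program; purely as a hedged illustration of the quadratic-cost minimization at the heart of an LSOCE-type design, the steady-state feedback gain of a discrete-time linear-quadratic regulator can be obtained by iterating the Riccati recursion. The matrices below are arbitrary placeholders, not values from the report.

```python
# Minimal sketch of the control half of an LSOCE-type design: iterate the
# discrete-time Riccati equation to a steady state and form the LQR gain.
import numpy as np

A = np.array([[1.0, 0.1], [0.0, 1.0]])   # state transition (placeholder)
B = np.array([[0.0], [0.1]])             # control input (placeholder)
Q = np.diag([1.0, 0.1])                  # state weighting in the quadratic index
R = np.array([[0.01]])                   # control weighting

P = Q.copy()
for _ in range(500):                     # fixed-point iteration of the Riccati recursion
    K = np.linalg.solve(R + B.T @ P @ B, B.T @ P @ A)
    P = Q + A.T @ P @ (A - B @ K)

print("steady-state feedback gain K =", K)  # u_k = -K x_k minimizes the quadratic index
```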
Analysis of research ethics board approval times in an academic department of medicine.
Tsang, Teresa S M; Jones, Meaghan; Meneilly, Graydon S
2015-04-01
As part of an ongoing effort to better understand barriers to academic research, we reviewed and analyzed the process of research ethics applications, focusing on ethics approval time, within the Department of Medicine from 2006 to 2011. A total of 1,268 applications for approval to use human subjects in research were included in our analysis. Three variables, risk category (minimal vs. non-minimal risk), type of funding, and year of submission, were statistically significant for prediction of ethics approval time, with risk status being the most important of these. The covariate-adjusted mean time for approval for minimal risk studies (35.7 days) was less than half that of non-minimal risk protocols (76.5 days). Studies funded through a for-profit sponsor had significantly longer approval times than those funded through other means but were also predominantly (87%) non-minimal risk protocols. Further investigations of the reasons underlying the observed differences are needed to determine whether improved training for research ethics board (REB) members and/or greater dialogue with investigators may reduce the lengthy approval times associated with non-minimal risk protocols. © The Author(s) 2015.
Blackfolds, plane waves and minimal surfaces
NASA Astrophysics Data System (ADS)
Armas, Jay; Blau, Matthias
2015-07-01
Minimal surfaces in Euclidean space provide examples of possible non-compact horizon geometries and topologies in asymptotically flat space-time. On the other hand, the existence of limiting surfaces in the space-time provides a simple mechanism for making these configurations compact. Limiting surfaces appear naturally in a given space-time by making minimal surfaces rotate but they are also inherent to plane wave or de Sitter space-times in which case minimal surfaces can be static and compact. We use the blackfold approach in order to scan for possible black hole horizon geometries and topologies in asymptotically flat, plane wave and de Sitter space-times. In the process we uncover several new configurations, such as black helicoids and catenoids, some of which have an asymptotically flat counterpart. In particular, we find that the ultraspinning regime of singly-spinning Myers-Perry black holes, described in terms of the simplest minimal surface (the plane), can be obtained as a limit of a black helicoid, suggesting that these two families of black holes are connected. We also show that minimal surfaces embedded in spheres rather than Euclidean space can be used to construct static compact horizons in asymptotically de Sitter space-times.
High-speed civil transport flight- and propulsion-control technological issues
NASA Technical Reports Server (NTRS)
Ray, J. K.; Carlin, C. M.; Lambregts, A. A.
1992-01-01
Technology advances required in the flight and propulsion control system disciplines to develop a high speed civil transport (HSCT) are identified. The mission and requirements of the transport and major flight and propulsion control technology issues are discussed. Each issue is ranked and, for each issue, a plan for technology readiness is given. Certain features are unique and dominate control system design. These features include the high temperature environment, large flexible aircraft, control-configured empennage, minimizing control margins, and high availability and excellent maintainability. The failure to resolve most high-priority issues can prevent the transport from achieving its goals. The flow-time for hardware may require stimulus, since market forces may be insufficient to ensure timely production. Flight and propulsion control technology will contribute to takeoff gross weight reduction. Similar technology advances are necessary also to ensure flight safety for the transport. The certification basis of the HSCT must be negotiated between airplane manufacturers and government regulators. Efficient, quality design of the transport will require an integrated set of design tools that support the entire engineering design team.
En Route Spacing System and Method
NASA Technical Reports Server (NTRS)
Erzberger, Heinz (Inventor); Green, Steven M. (Inventor)
2002-01-01
A method of and computer software for minimizing aircraft deviations needed to comply with an en route miles-in-trail spacing requirement imposed during air traffic control operations via establishing a spacing reference geometry, predicting spatial locations of a plurality of aircraft at a predicted time of intersection of a path of a first of said plurality of aircraft with the spacing reference geometry, and determining spacing of each of the plurality of aircraft based on the predicted spatial locations.
More tooth, Less Skull: Force Structure Changes for an Uncertain Future
2012-05-17
deployments in OIF and OEF are the archetypes for force employment and span of control, highlighting the limits of modularity and the creation of ad hoc...square miles in extent. There was a wider dispersion of formations to minimize the effect of an enemy’s tactical atomic weapons. At the same time, this...The sweeping reorganization to meet the requirements of the atomic battlefield became the Pentomic Division. The Pentomic Division, officially known
Extracellular space preservation aids the connectomic analysis of neural circuits
Pallotto, Marta; Watkins, Paul V; Fubara, Boma; Singer, Joshua H; Briggman, Kevin L
2015-01-01
Dense connectomic mapping of neuronal circuits is limited by the time and effort required to analyze 3D electron microscopy (EM) datasets. Algorithms designed to automate image segmentation suffer from substantial error rates and require significant manual error correction. Any improvement in segmentation error rates would therefore directly reduce the time required to analyze 3D EM data. We explored preserving extracellular space (ECS) during chemical tissue fixation to improve the ability to segment neurites and to identify synaptic contacts. ECS preserved tissue is easier to segment using machine learning algorithms, leading to significantly reduced error rates. In addition, we observed that electrical synapses are readily identified in ECS preserved tissue. Finally, we determined that antibodies penetrate deep into ECS preserved tissue with only minimal permeabilization, thereby enabling correlated light microscopy (LM) and EM studies. We conclude that preservation of ECS benefits multiple aspects of the connectomic analysis of neural circuits. DOI: http://dx.doi.org/10.7554/eLife.08206.001 PMID:26650352
Duo, Jia; Dong, Huijin; DeSilva, Binodh; Zhang, Yan J
2013-07-01
Sample dilution and reagent pipetting are time-consuming steps in ligand-binding assays (LBAs). Traditional automation-assisted LBAs use assay-specific scripts that require labor-intensive script writing and user training. Five major script modules were developed on Tecan Freedom EVO liquid handling software to facilitate the automated sample preparation and LBA procedure: sample dilution, sample minimum required dilution, standard/QC minimum required dilution, standard/QC/sample addition, and reagent addition. The modular design of automation scripts allowed the users to assemble an automated assay with minimal script modification. The application of the template was demonstrated in three LBAs to support discovery biotherapeutic programs. The results demonstrated that the modular scripts provided the flexibility in adapting to various LBA formats and the significant time saving in script writing and scientist training. Data generated by the automated process were comparable to those by manual process while the bioanalytical productivity was significantly improved using the modular robotic scripts.
Development of prolonged standing strain index to quantify risk levels of standing jobs.
Halim, Isa; Omar, Abdul Rahman
2012-01-01
Many occupations in industry such as metal stamping workers, electronics parts assembly operators, automotive industry welders, and lathe operators require working in a standing posture for a long time. Prolonged standing can contribute to discomfort and muscle fatigue particularly in the back and legs. This study developed the prolonged standing strain index (PSSI) to quantify the risk levels caused by standing jobs, and proposed recommendations to minimize the risk levels. Risk factors associated with standing jobs, such as working posture, muscle activity, standing duration, holding time, whole-body vibration, and indoor air quality, were the basis for developing the PSSI. All risk factors were assigned multipliers, and the PSSI was the product of those multipliers. Recommendations for improvement are based on the PSSI; however, extensive studies are required to validate their effectiveness.
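The abstract states only that each risk factor is assigned a multiplier and that the PSSI is the product of those multipliers; the sketch below illustrates that scoring scheme with invented factor names and multiplier values, not the published ones.

```python
# Hypothetical sketch of a multiplicative index like the PSSI: each risk factor
# maps to a multiplier and the index is their product. Values below are made up.
from math import prod

def standing_strain_index(multipliers):
    """Return the product of the per-factor multipliers."""
    return prod(multipliers.values())

example_job = {
    "working_posture": 1.5,
    "muscle_activity": 1.2,
    "standing_duration": 2.0,
    "holding_time": 1.0,
    "whole_body_vibration": 1.0,
    "indoor_air_quality": 1.1,
}
print(standing_strain_index(example_job))  # 3.96 for this hypothetical job
```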
Rappoport, Louis H; Luna, Ingrid Y; Joshua, Gita
2017-05-01
Proper diagnosis and treatment of sacroiliac joint (SIJ) pain remains a clinical challenge. Dysfunction of the SIJ can produce pain in the lower back, buttocks, and extremities. Triangular titanium implants for minimally invasive surgical arthrodesis have been available for several years, with reputed high levels of success and patient satisfaction. This study reports on a novel hydroxyapatite-coated screw for surgical treatment of SIJ pain. Data were prospectively collected on 32 consecutive patients who underwent minimally invasive SIJ fusion with a novel hydroxyapatite-coated screw. Clinical assessments and radiographs were collected and evaluated at 3, 6, and 12 months postoperatively. Mean (standard deviation) patient age was 55.2 ± 10.7 years, and 62.5% were female. More patients (53.1%) underwent left versus right SIJ treatment, mean operative time was 42.6 ± 20.4 minutes, and estimated blood loss did not exceed 50 mL. Overnight hospital stay was required for 84% of patients, and the remaining patients needed a 2-day stay (16%). Mean preoperative visual analog scale back and leg pain scores decreased significantly by 12 months postoperatively (P < 0.01). Mechanical stability was achieved in 93.3% (28/30) of patients, and all patients who were employed preoperatively returned to work within 3 months. Two patients who required revision surgery reported symptom improvement within 3 weeks and did not require subsequent surgery. Positive clinical outcomes are reported 1 year postoperatively after implantation of a novel implant to treat sacroiliac joint pain. Future clinical studies with larger samples are warranted to assess long-term patient outcomes. Copyright © 2017 Elsevier Inc. All rights reserved.
The Strategic WAste Minimization Initiative (SWAMI) Software, Version 2.0 is a tool for using process analysis for identifying waste minimization opportunities within an industrial setting. The software requires user-supplied information for process definition, as well as materia...
Aeroelasticity of morphing wings using neural networks
NASA Astrophysics Data System (ADS)
Natarajan, Anand
In this dissertation, neural networks are designed to effectively model static non-linear aeroelastic problems in adaptive structures and linear dynamic aeroelastic systems with time varying stiffness. The use of adaptive materials in aircraft wings allows for the change of the contour or the configuration of a wing (morphing) in flight. The use of smart materials, to accomplish these deformations, can imply that the stiffness of the wing with a morphing contour changes as the contour changes. For a rapidly oscillating body in a fluid field, continuously adapting structural parameters may render the wing to behave as a time variant system. Even the internal spars/ribs of the aircraft wing which define the wing stiffness can be made adaptive, that is, their stiffness can be made to vary with time. The immediate effect on the structural dynamics of the wing, is that, the wing motion is governed by a differential equation with time varying coefficients. The study of this concept of a time varying torsional stiffness, made possible by the use of active materials and adaptive spars, in the dynamic aeroelastic behavior of an adaptable airfoil is performed here. Another type of aeroelastic problem of an adaptive structure that is investigated here, is the shape control of an adaptive bump situated on the leading edge of an airfoil. Such a bump is useful in achieving flow separation control for lateral directional maneuverability of the aircraft. Since actuators are being used to create this bump on the wing surface, the energy required to do so needs to be minimized. The adverse pressure drag as a result of this bump needs to be controlled so that the loss in lift over the wing is made minimal. The design of such a "spoiler bump" on the surface of the airfoil is an optimization problem of maximizing pressure drag due to flow separation while minimizing the loss in lift and energy required to deform the bump. One neural network is trained using the CFD code FLUENT to represent the aerodynamic loading over the bump. A second neural network is trained for calculating the actuator loads, bump displacement and lift, drag forces over the airfoil using the finite element solver, ANSYS and the previously trained neural network. This non-linear aeroelastic model of the deforming bump on an airfoil surface using neural networks can serve as a fore-runner for other non-linear aeroelastic problems.
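The dissertation trains neural networks as surrogates for FLUENT and ANSYS responses; as a loose, hedged illustration of that surrogate-modeling idea (with synthetic data standing in for the CFD/FEA samples and a hypothetical single design variable), a small regression network could be fit as follows:

```python
# Rough sketch of a neural-network surrogate: map a design input (e.g. bump
# height) to responses (e.g. lift loss and drag gain). The data here are
# synthetic; in the dissertation the samples come from FLUENT and ANSYS runs.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(1)
bump_height = rng.uniform(0.0, 1.0, size=(300, 1))
lift_loss = 0.3 * bump_height[:, 0] ** 2
drag_gain = 0.5 * np.tanh(3.0 * bump_height[:, 0])
targets = np.column_stack([lift_loss, drag_gain])

surrogate = MLPRegressor(hidden_layer_sizes=(16, 16), max_iter=5000, random_state=0)
surrogate.fit(bump_height, targets)

# Once trained, the surrogate is cheap to evaluate inside an optimizer that
# trades pressure drag against lift loss and actuation energy.
print(surrogate.predict([[0.4]]))
```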
A Decentralized Scheduling Policy for a Dynamically Reconfigurable Production System
NASA Astrophysics Data System (ADS)
Giordani, Stefano; Lujak, Marin; Martinelli, Francesco
In this paper, the static layout of a traditional multi-machine factory producing a set of distinct goods is integrated with a set of mobile production units - robots. The robots dynamically change their work positions to increase the production rates of the different product types in response to fluctuations in demand and production costs over a given time horizon. Assuming that the planning time horizon is subdivided into a finite number of time periods, this particularly flexible layout requires the definition and solution of a complex scheduling problem: for each period of the planning horizon, the positions of the robots, i.e., their assignments to the respective tasks, must be determined so as to minimize production costs given the product demand rates.
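The paper's full scheduling formulation is not given in the abstract; as a simplified, single-period sketch, assigning mobile robots to work positions so that a hypothetical cost matrix is minimized can be posed as a linear assignment problem:

```python
# Simplified illustration: for one period of the planning horizon, assign each
# mobile robot to a work position so total (hypothetical) production cost is
# minimized. The real problem in the paper also couples the periods together.
import numpy as np
from scipy.optimize import linear_sum_assignment

rng = np.random.default_rng(2)
n_robots, n_positions = 4, 6
cost = rng.uniform(1.0, 10.0, size=(n_robots, n_positions))  # cost[r, p]

rows, cols = linear_sum_assignment(cost)          # optimal robot -> position pairs
print(list(zip(rows, cols)), cost[rows, cols].sum())
```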
Hidri, Lotfi; Gharbi, Anis; Louly, Mohamed Aly
2014-01-01
We focus on the two-center hybrid flow shop scheduling problem with identical parallel machines and removal times. The job removal time is the required duration to remove it from a machine after its processing. The objective is to minimize the maximum completion time (makespan). A heuristic and a lower bound are proposed for this NP-Hard problem. These procedures are based on the optimal solution of the parallel machine scheduling problem with release dates and delivery times. The heuristic is composed of two phases. The first one is a constructive phase in which an initial feasible solution is provided, while the second phase is an improvement one. Intensive computational experiments have been conducted to confirm the good performance of the proposed procedures.
Efficient Bounding Schemes for the Two-Center Hybrid Flow Shop Scheduling Problem with Removal Times
2014-01-01
We focus on the two-center hybrid flow shop scheduling problem with identical parallel machines and removal times. The job removal time is the required duration to remove it from a machine after its processing. The objective is to minimize the maximum completion time (makespan). A heuristic and a lower bound are proposed for this NP-Hard problem. These procedures are based on the optimal solution of the parallel machine scheduling problem with release dates and delivery times. The heuristic is composed of two phases. The first one is a constructive phase in which an initial feasible solution is provided, while the second phase is an improvement one. Intensive computational experiments have been conducted to confirm the good performance of the proposed procedures. PMID:25610911
Low-Friction, High-Stiffness Joint for Uniaxial Load Cell
NASA Technical Reports Server (NTRS)
Lewis, James L.; Le, Thang; Carroll, Monty B.
2007-01-01
A universal-joint assembly has been devised for transferring axial tension or compression to a load cell. To maximize measurement accuracy, the assembly is required to minimize any moments and non-axial forces on the load cell and to exhibit little or no hysteresis. The requirement to minimize hysteresis translates to a requirement to maximize axial stiffness (including minimizing backlash) and a simultaneous requirement to minimize friction. In practice, these are competing requirements, encountered repeatedly in efforts to design universal joints. Often, universal-joint designs represent compromises between these requirements. The improved universal-joint assembly contains two universal joints, each containing two adjustable pairs of angular-contact ball bearings. One might be tempted to ask why one could not use simple ball-and-socket joints rather than something as complex as universal joints containing adjustable pairs of angular-contact ball bearings. The answer is that ball-and-socket joints do not offer sufficient latitude to trade stiffness versus friction: the inevitable result of an attempt to make such a trade in a ball-and-socket joint is either too much backlash or too much friction. The universal joints are located at opposite ends of an axial subassembly that contains the load cell. The axial subassembly includes an axial shaft, an axial housing, and a fifth adjustable pair of angular-contact ball bearings that allows rotation of the axial housing relative to the shaft. The preload on each pair of angular-contact ball bearings can be adjusted to obtain the required stiffness with minimal friction, tailored for a specific application. The universal joint at each end affords two degrees of freedom, allowing only axial force to reach the load cell regardless of application of moments and non-axial forces. The rotational joint on the axial subassembly affords a fifth degree of freedom, preventing application of a torsion load to the load cell.
Saving Material with Systematic Process Designs
NASA Astrophysics Data System (ADS)
Kerausch, M.
2011-08-01
Global competition is forcing the stamping industry to further increase quality, to shorten time-to-market and to reduce total cost. Continuous balancing between these classical time-cost-quality targets throughout the product development cycle is required to ensure future economical success. In today's industrial practice, die layout standards are typically assumed to implicitly ensure the balancing of company specific time-cost-quality targets. Although die layout standards are a very successful approach, there are two methodical disadvantages. First, the capabilities for tool design have to be continuously adapted to technological innovations; e.g. to take advantage of the full forming capability of new materials. Secondly, the great variety of die design aspects have to be reduced to a generic rule or guideline; e.g. binder shape, draw-in conditions or the use of drawbeads. Therefore, it is important to not overlook cost or quality opportunities when applying die design standards. This paper describes a systematic workflow with focus on minimizing material consumption. The starting point of the investigation is a full process plan for a typical structural part. All requirements are defined such that a predefined set of die design standards with industrial relevance is fulfilled. In a first step, binder and addendum geometry is systematically checked for material saving potentials. In a second step, blank shape and draw-in are adjusted to meet thinning, wrinkling and springback targets for a minimum blank solution. Finally, the identified die layout is validated with respect to production robustness versus splits, wrinkles and springback. For all three steps the applied methodology is based on finite element simulation combined with a stochastic variation of input variables. With the proposed workflow a well-balanced (time-cost-quality) production process assuring minimal material consumption can be achieved.
NASA Technical Reports Server (NTRS)
Ha, Kong Q.; Femiano, Michael D.; Mosier, Gary E.
2004-01-01
In this paper, we present an optimal open-loop slew trajectory algorithm developed at GSFC for the so-called "Yardstick design" of the James Webb Space Telescope (JWST). JWST is an orbiting infrared observatory featuring a lightweight, segmented primary mirror approximately 6 meters in diameter and a sunshield approximately the size of a tennis court. This large, flexible structure will have significant number of lightly damped, dominant flexible modes. With very stringent requirements on pointing accuracy and image quality, it is important that slewing be done within the required time constraint and with minimal induced vibration in order to maximize observing efficiency. With reaction wheels as control actuators, initial wheel speeds as well as individual wheel torque and momentum limits become dominant constraints in slew performance. These constraints must be taken into account when performing slews to ensure that unexpected reaction wheel saturation does not occur, since such saturation leads to control failure in accurately tracking commanded motion and produces high frequency torque components capable of exciting structural modes. A minimum-time constraint is also included and coupled with reaction wheel limit constraints in the optimization to minimize both the effect of the control torque on the flexible body motion and the maneuver time. The optimization is on slew command parameters, such as maximum slew velocity and acceleration, for a given redundant reaction wheel configuration and is based on the dynamic interaction between the spacecraft and reaction wheel motion. Analytical development of the slew algorithm to generate desired slew position, rate, and acceleration profiles to command a feedback/feed forward control system is described. High-fidelity simulation and experimental results are presented to show that the developed slew law achieves the objectives.
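The GSFC algorithm optimizes slew command parameters against reaction wheel torque and momentum limits; the sketch below is only a much-simplified stand-in that generates a symmetric accelerate-coast-decelerate command profile from assumed peak-rate and peak-acceleration limits (all values are placeholders, and the mapping back to per-wheel constraints is omitted):

```python
# Toy sketch of an open-loop slew command profile: accelerate at a_max, coast at
# v_max, then decelerate, for a given slew angle. Limits and the angle are made up.
import numpy as np

def slew_profile(angle, v_max, a_max, dt=0.1):
    t_acc = v_max / a_max                       # time to reach the peak rate
    if a_max * t_acc**2 >= angle:               # triangular profile (never reaches v_max)
        t_acc = np.sqrt(angle / a_max)
        t_coast = 0.0
    else:                                       # trapezoidal profile with a coast phase
        t_coast = (angle - a_max * t_acc**2) / v_max
    t_total = 2.0 * t_acc + t_coast
    t = np.arange(0.0, t_total + dt, dt)
    acc = np.where(t < t_acc, a_max, np.where(t < t_acc + t_coast, 0.0, -a_max))
    rate = np.cumsum(acc) * dt                  # numerically integrated rate profile
    pos = np.cumsum(rate) * dt                  # and position profile
    return t, pos, rate, acc

t, pos, rate, acc = slew_profile(angle=np.radians(20.0), v_max=0.002, a_max=1e-5)
```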
Sideband Algorithm for Automatic Wind Turbine Gearbox Fault Detection and Diagnosis: Preprint
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zappala, D.; Tavner, P.; Crabtree, C.
2013-01-01
Improving the availability of wind turbines (WT) is critical to minimize the cost of wind energy, especially for offshore installations. As gearbox downtime has a significant impact on WT availabilities, the development of reliable and cost-effective gearbox condition monitoring systems (CMS) is of great concern to the wind industry. Timely detection and diagnosis of developing gear defects within a gearbox is an essential part of minimizing unplanned downtime of wind turbines. Monitoring signals from WT gearboxes are highly non-stationary as turbine load and speed vary continuously with time. Time-consuming and costly manual handling of large amounts of monitoring data represents one of the main limitations of most current CMSs, so automated algorithms are required. This paper presents a fault detection algorithm for incorporation into a commercial CMS for automatic gear fault detection and diagnosis. The algorithm allowed the assessment of gear fault severity by tracking progressive tooth gear damage during variable speed and load operating conditions of the test rig. Results show that the proposed technique proves efficient and reliable for detecting gear damage. Once implemented into WT CMSs, this algorithm can automate data interpretation reducing the quantity of information that WT operators must handle.
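The publication's sideband algorithm is not reproduced in the abstract; as a bare-bones illustration of the general idea of tracking sideband energy spaced at the shaft frequency around the gear-mesh frequency, one might compute (with entirely synthetic frequencies and signal):

```python
# Illustrative only: estimate sideband energy around a gear-mesh frequency from
# a vibration signal. Frequencies, bandwidths, and the signal are synthetic.
import numpy as np

fs = 10_000.0                       # sample rate, Hz
t = np.arange(0, 2.0, 1.0 / fs)
f_mesh, f_shaft = 600.0, 20.0       # hypothetical mesh and shaft frequencies
# Synthetic signal: mesh tone amplitude-modulated at the shaft rate (a classic fault signature)
x = (1.0 + 0.3 * np.sin(2 * np.pi * f_shaft * t)) * np.sin(2 * np.pi * f_mesh * t)

spectrum = np.abs(np.fft.rfft(x)) / len(x)
freqs = np.fft.rfftfreq(len(x), 1.0 / fs)

def band_energy(f0, half_width=1.0):
    mask = np.abs(freqs - f0) <= half_width
    return np.sum(spectrum[mask] ** 2)

sidebands = [f_mesh + k * f_shaft for k in (-2, -1, 1, 2)]
ratio = sum(band_energy(f) for f in sidebands) / band_energy(f_mesh)
print(f"sideband-to-mesh energy ratio: {ratio:.3f}")   # grows as tooth damage develops
```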
Propagation speed of a starting wave in a queue of pedestrians.
Tomoeda, Akiyasu; Yanagisawa, Daichi; Imamura, Takashi; Nishinari, Katsuhiro
2012-09-01
The propagation speed of a starting wave, which is a wave of people's successive reactions in the relaxation process of a queue, has an essential role for pedestrians and vehicles to achieve smooth movement. For example, a queue of vehicles with appropriate headway (or density) alleviates traffic jams since the delay of reaction to start is minimized. In this paper, we have investigated the fundamental relation between the propagation speed of a starting wave and the initial density by both our mathematical model built on the stochastic cellular automata and experimental measurements. Analysis of our mathematical model implies that the relation is characterized by the power law αρ^(-β) (β ≠ 1), and the experimental results verify this feature. Moreover, when the starting wave is characterized by the power law (β > 1), we have revealed the existence of optimal density, where the required time, i.e., the sum of the waiting time until the starting wave reaches the last pedestrian in a queue and his/her travel time to pass the head position of the initial queue, is minimized. This optimal density inevitably plays a significant role in achieving a smooth movement of crowds and vehicles in a queue.
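The published model is a stochastic cellular automaton; purely to illustrate why an interior optimal density can exist when the starting-wave speed follows a power law with β > 1, the toy calculation below minimizes an assumed total-time expression (wave delay plus walking time) over density. The functional form and parameters are illustrative assumptions, not the paper's.

```python
# Toy illustration (not the paper's cellular-automaton model): total time for the
# last of N queued pedestrians = time for the starting wave to reach them plus the
# time to walk the initial queue length, with wave speed c(rho) = alpha * rho**(-beta).
import numpy as np

N, alpha, beta, v_walk = 20, 1.0, 1.5, 1.3   # hypothetical parameters (beta > 1)

rho = np.linspace(0.2, 5.0, 1000)            # candidate initial densities (1/m)
queue_length = N / rho
wait_time = queue_length / (alpha * rho ** (-beta))   # = (N/alpha) * rho**(beta-1)
walk_time = queue_length / v_walk
total_time = wait_time + walk_time

rho_opt = rho[np.argmin(total_time)]
print(f"optimal initial density ~ {rho_opt:.2f} per metre")
```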
Decomposition technique and optimal trajectories for the aeroassisted flight experiment
NASA Technical Reports Server (NTRS)
Miele, A.; Wang, T.; Deaton, A. W.
1990-01-01
An actual geosynchronous Earth orbit-to-low Earth orbit (GEO-to-LEO) transfer is considered with reference to the aeroassisted flight experiment (AFE) spacecraft, and optimal trajectories are determined by minimizing the total characteristic velocity. The optimization is performed with respect to the time history of the controls (angle of attack and angle of bank), the entry path inclination and the flight time being free. Two transfer maneuvers are considered: direct ascent (DA) to LEO and indirect ascent (IA) to LEO via parking Earth orbit (PEO). By taking into account certain assumptions, the complete system can be decoupled into two subsystems: one describing the longitudinal motion and one describing the lateral motion. The angle of attack history, the entry path inclination, and the flight time are determined via the longitudinal motion subsystem. In this subsystem, the difference between the instantaneous bank angle and a constant bank angle is minimized in the least square sense subject to the specified orbital inclination requirement. Both the angles of attack and the angle of bank are shown to be constant. This result has considerable importance in the design of nominal trajectories to be used in the guidance of AFE and aeroassisted orbital transfer (AOT) vehicles.
Mallik, Tanuja; Aneja, S; Tope, R; Muralidhar, V
2012-01-01
Background: In the administration of minimal flow anesthesia, traditionally a fixed time period of high flow has been used before changing over to minimal flow. However, newer studies have used “equilibration time” of a volatile anesthetic agent as the change-over point. Materials and Methods: A randomized prospective study was conducted on 60 patients, who were divided into two groups of 30 patients each. Two volatile inhalational anesthetic agents were compared. Group I received desflurane (n = 30) and group II isoflurane (n = 30). Both groups received an initial high flow until equilibration between inspired (Fi) and expired (Fe) agent concentrations was achieved, which was defined as Fe/Fi = 0.8. The mean (SD) equilibration time was obtained for both agents. Then, the drift in end-tidal agent concentration during minimal flow anesthesia and the recovery profile were noted. Results: The mean equilibration times obtained for desflurane and isoflurane were 4.96 ± 1.60 and 16.96 ± 9.64 min, respectively (P < 0.001). The drift in end-tidal agent concentration over time was minimal in the desflurane group (P = 0.065). Recovery time was 5.70 ± 2.78 min in the desflurane group and 8.06 ± 31 min in the isoflurane group (P = 0.004). Conclusion: Use of the equilibration time of the volatile anesthetic agent as the change-over point from high flow to minimal flow can help us use minimal flow anesthesia in a more efficient way. PMID:23225926
LSST: Cadence Design and Simulation
NASA Astrophysics Data System (ADS)
Cook, Kem H.; Pinto, P. A.; Delgado, F.; Miller, M.; Petry, C.; Saha, A.; Gee, P. A.; Tyson, J. A.; Ivezic, Z.; Jones, L.; LSST Collaboration
2009-01-01
The LSST Project has developed an operations simulator to investigate how best to observe the sky to achieve its multiple science goals. The simulator has a sophisticated model of the telescope and dome to properly constrain potential observing cadences. This model has also proven useful for investigating various engineering issues ranging from sizing of slew motors, to design of cryogen lines to the camera. The simulator is capable of balancing cadence goals from multiple science programs, and attempts to minimize time spent slewing as it carries out these goals. The operations simulator has been used to demonstrate a 'universal' cadence which delivers the science requirements for a deep cosmology survey, a Near Earth Object Survey and good sampling in the time domain. We will present the results of simulating 10 years of LSST operations using realistic seeing distributions, historical weather data, scheduled engineering downtime and current telescope and camera parameters. These simulations demonstrate the capability of the LSST to deliver a 25,000 square degree survey probing the time domain including 20,000 square degrees for a uniform deep, wide, fast survey, while effectively surveying for NEOs over the same area. We will also present our plans for future development of the simulator--better global minimization of slew time and eventual transition to a scheduler for the real LSST.
Zhao, Ximei; Ren, Chengyi; Liu, Hao; Li, Haogyi
2014-12-01
Robotic catheter minimally invasive operation requires that the driver control system respond quickly, reject disturbances, and track the target trajectory in real time. Because the catheter's own parameters, the movement environment, and other factors change continuously, a traditional proportional-integral-derivative (PID) controller has fixed gains once its parameters are set; it cannot adapt to changes in the controlled object or to environmental disturbances, which degrades position tracking accuracy and may produce a large overshoot that endangers the patient's vessel. Therefore, this paper adopts a fuzzy PID control method that adjusts the PID gain parameters during tracking in order to improve the anti-interference ability, dynamic performance, and tracking accuracy of the system. The simulation results showed that the fuzzy PID control method had fast tracking performance and strong robustness. Compared with traditional PID control, the feasibility and practicability of fuzzy PID control are verified in a robotic catheter minimally invasive operation.
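The paper's membership functions and rule base are not given in the abstract; the sketch below is a generic illustration of the fuzzy PID idea, blending the controller gains according to the fuzzified magnitude of the tracking error, with invented rules and gain ranges.

```python
# Generic fuzzy-gain-scheduled PID sketch (rules, gains, and plant are invented,
# not the paper's): the error magnitude is fuzzified into "small"/"large" sets
# and the PID gains are blended accordingly before the usual PID law is applied.
def membership_large(e, e_ref=1.0):
    """Degree of membership of |error| in the 'large' set, clipped to [0, 1]."""
    return max(0.0, min(1.0, abs(e) / e_ref))

def fuzzy_pid_step(error, prev_error, integral, dt,
                   kp=(2.0, 6.0), ki=(0.5, 0.2), kd=(0.1, 0.4)):
    mu = membership_large(error)                       # degree of "large error"
    # Blend (small-error gain, large-error gain) by the membership value
    Kp = (1 - mu) * kp[0] + mu * kp[1]
    Ki = (1 - mu) * ki[0] + mu * ki[1]
    Kd = (1 - mu) * kd[0] + mu * kd[1]
    integral += error * dt
    derivative = (error - prev_error) / dt
    u = Kp * error + Ki * integral + Kd * derivative
    return u, integral

# One illustrative control step
u, integral = fuzzy_pid_step(error=0.3, prev_error=0.25, integral=0.0, dt=0.01)
print(u)
```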
Water and wastewater minimization plan in food industries.
Ganjidoust, H; Ayati, B
2002-01-01
Iran is one of the countries located in a dry and semi-dry region, and many provinces, such as Tehran, have faced problems in recent years because of low precipitation. Many research works have been carried out to reduce wastewater treatment costs and water consumption. One of them concerns the food industries group, which consumes a great amount of water in different units. For example, in the beverage industry, washing glass bottles seven times requires large amounts of water, whereas the use of plastic bottles can reduce water consumption. Another problem is leakage from pipelines, valves, etc.; their repair plays an important role in preventing the wastage of water. Non-polluted waste water can be used for washing halls, watering green yards, recycling to the process, or reuse in cooling towers. In this paper, after a short review of waste minimization plans in food industries, problems concerning water-consuming and wastewater-producing units in three Iranian food industries have been investigated. Finally, some suggestions are given for implementing the water and wastewater minimization plan in the companies.
Robust model-based 3D/3D fusion using sparse matching for minimally invasive surgery.
Neumann, Dominik; Grbic, Sasa; John, Matthias; Navab, Nassir; Hornegger, Joachim; Ionasec, Razvan
2013-01-01
Classical surgery is being disrupted by minimally invasive and transcatheter procedures. As there is no direct view or access to the affected anatomy, advanced imaging techniques such as 3D C-arm CT and C-arm fluoroscopy are routinely used for intra-operative guidance. However, intra-operative modalities have limited image quality of the soft tissue and a reliable assessment of the cardiac anatomy can only be made by injecting contrast agent, which is harmful to the patient and requires complex acquisition protocols. We propose a novel sparse matching approach for fusing high quality pre-operative CT and non-contrasted, non-gated intra-operative C-arm CT by utilizing robust machine learning and numerical optimization techniques. Thus, high-quality patient-specific models can be extracted from the pre-operative CT and mapped to the intra-operative imaging environment to guide minimally invasive procedures. Extensive quantitative experiments demonstrate that our model-based fusion approach has an average execution time of 2.9 s, while the accuracy lies within expert user confidence intervals.
Caronia, Francesco Paolo; Arrigo, Ettore; Failla, Andrea Valentino; Sgalambro, Francesco; Giannone, Giorgio; Lo Monte, Attilio Ignazio; Cajozzo, Massimo; Santini, Mario
2018-01-01
A 67-year-old man was referred to our attention for management of esophageal adenocarcinoma, localized at the level of the esophagogastric junction and obstructing one third of the esophageal lumen. Due to the extension of the disease (T3N1M0-Stage IIIA), the patient underwent neo-adjuvant chemo-radiation therapy and was then scheduled for a minimally invasive surgical procedure including laparoscopic gastroplasty, uniportal thoracoscopic esophageal dissection and intrathoracic end-to-end esophago-gastric anastomosis. No intraoperative or post-operative complications were seen. The patient was discharged on post-operative day 9. Pathological study confirmed the diagnosis of adenocarcinoma (T2N1M0-Stage IIB) and he underwent adjuvant chemotherapy. At the time of the present paper, the patient is alive and well without signs of recurrence or metastasis. Our minimally invasive approach, compared to the standard open procedure, would help reduce post-operative pain and favour an early return to normal activity. However, future experiences with a control group are required before our strategy can be widely used. PMID:29850166
Caronia, Francesco Paolo; Arrigo, Ettore; Failla, Andrea Valentino; Sgalambro, Francesco; Giannone, Giorgio; Lo Monte, Attilio Ignazio; Cajozzo, Massimo; Santini, Mario; Fiorelli, Alfonso
2018-04-01
A 67-year-old man was referred to our attention for management of esophageal adenocarcinoma, localized at the level of the esophagogastric junction and obstructing one third of the esophageal lumen. Due to the extension of the disease (T3N1M0-Stage IIIA), the patient underwent neo-adjuvant chemo-radiation therapy and was then scheduled for a minimally invasive surgical procedure including laparoscopic gastroplasty, uniportal thoracoscopic esophageal dissection and intrathoracic end-to-end esophago-gastric anastomosis. No intraoperative or post-operative complications were seen. The patient was discharged on post-operative day 9. Pathological study confirmed the diagnosis of adenocarcinoma (T2N1M0-Stage IIB) and he underwent adjuvant chemotherapy. At the time of the present paper, the patient is alive and well without signs of recurrence or metastasis. Our minimally invasive approach, compared to the standard open procedure, would help reduce post-operative pain and favour an early return to normal activity. However, future experiences with a control group are required before our strategy can be widely used.
NASA Astrophysics Data System (ADS)
Hoerning, Sebastian; Bardossy, Andras; du Plessis, Jaco
2017-04-01
Most geostatistical inverse groundwater flow and transport modelling approaches utilize a numerical solver to minimize the discrepancy between observed and simulated hydraulic heads and/or hydraulic concentration values. The optimization procedure often requires many model runs, which for complex models lead to long run times. Random Mixing is a promising new geostatistical technique for inverse modelling. The method is an extension of the gradual deformation approach. It works by finding a field which preserves the covariance structure and maintains observed hydraulic conductivities. This field is perturbed by mixing it with new fields that fulfill the homogeneous conditions. This mixing is expressed as an optimization problem which aims to minimize the difference between the observed and simulated hydraulic heads and/or concentration values. To preserve the spatial structure, the mixing weights must lie on the unit hyper-sphere. We present a modification to the Random Mixing algorithm which significantly reduces the number of model runs required. The approach involves taking n equally spaced points on the unit circle as weights for mixing conditional random fields. Each of these mixtures provides a solution to the forward model at the conditioning locations. For each of the locations the solutions are then interpolated around the circle to provide solutions for additional mixing weights at very low computational cost. The interpolated solutions are used to search for a mixture which maximally reduces the objective function. This is in contrast to other approaches which evaluate the objective function for the n mixtures and then interpolate the obtained values. Keeping the mixture on the unit circle makes it easy to generate equidistant sampling points in the space; however, this means that only two fields are mixed at a time. Once the optimal mixture for two fields has been found, they are combined to form the input to the next iteration of the algorithm. This process is repeated until a threshold in the objective function is met or insufficient changes are produced in successive iterations.
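As a stripped-down illustration of the mixing step described above (omitting the paper's interpolation of forward-model responses around the circle), two conditional standard-normal fields can be combined with weights cos θ and sin θ, which preserves the marginal variance, and θ scanned for the mixture that best reduces a placeholder objective:

```python
# Minimal sketch of the unit-circle mixing idea: combine two (already conditional)
# Gaussian fields with weights cos(theta), sin(theta) so the covariance structure
# is preserved, and pick the theta that minimizes a placeholder objective. The
# forward groundwater model is replaced here by a dummy function.
import numpy as np

rng = np.random.default_rng(3)
field_a = rng.standard_normal((50, 50))     # stand-ins for conditional random fields
field_b = rng.standard_normal((50, 50))

def objective(field):
    # Placeholder for "misfit between simulated and observed heads/concentrations"
    return np.abs(field[10:13, 10:13].mean() - 0.8)

thetas = np.linspace(0.0, 2 * np.pi, 64, endpoint=False)
mixtures = [np.cos(th) * field_a + np.sin(th) * field_b for th in thetas]
best = int(np.argmin([objective(m) for m in mixtures]))

field_a = mixtures[best]                    # becomes one input to the next iteration
print(f"best theta = {thetas[best]:.3f}, objective = {objective(field_a):.4f}")
```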
Characterizing DebriSat Fragments: So Many Fragments, So Much Data, and So Little Time
NASA Technical Reports Server (NTRS)
Shiotani, B.; Rivero, M.; Carrasquilla, M.; Allen, S.; Fitz-Coy, N.; Liou, J.-C.; Huynh, T.; Sorge, M.; Cowardin, H.; Opiela, J.;
2017-01-01
To improve prediction accuracy, the DebriSat project was conceived by NASA and DoD to update existing standard break-up models. Updating standard break-up models requires detailed fragment characteristics such as physical size, material properties, bulk density, and ballistic coefficient. For the DebriSat project, a representative modern LEO spacecraft was developed and subjected to a laboratory hypervelocity impact test and all generated fragments with at least one dimension greater than 2 mm are collected, characterized and archived. Since the beginning of the characterization phase of the DebriSat project, over 130,000 fragments have been collected and approximately 250,000 fragments are expected to be collected in total, a three-fold increase over the 85,000 fragments predicted by the current break-up model. The challenge throughout the project has been to ensure the integrity and accuracy of the characteristics of each fragment. To this end, the post hypervelocity-impact test activities, which include fragment collection, extraction, and characterization, have been designed to minimize handling of the fragments. The procedures for fragment collection, extraction, and characterization were painstakingly designed and implemented to maintain the post-impact state of the fragments, thus ensuring the integrity and accuracy of the characterization data. Each process is designed to expedite the accumulation of data; however, the need for speed is restrained by the need to protect the fragments. Methods to expedite the process such as parallel processing have been explored and implemented while continuing to maintain the highest integrity and value of the data. To minimize fragment handling, automated systems have been developed and implemented. Errors due to human inputs are also minimized by the use of these automated systems. This paper discusses the processes and challenges involved in the collection, extraction, and characterization of the fragments as well as the time required to complete the processes. The objective is to provide the orbital debris community an understanding of the scale of the effort required to generate and archive high quality data and metadata for each debris fragment 2 mm or larger generated by the DebriSat project.
Gray, Richard A; Pathmanathan, Pras
2016-10-01
Elucidating the underlying mechanisms of fatal cardiac arrhythmias requires a tight integration of electrophysiological experiments, models, and theory. Existing models of transmembrane action potential (AP) are complex (resulting in over parameterization) and varied (leading to dissimilar predictions). Thus, simpler models are needed to elucidate the "minimal physiological requirements" to reproduce significant observable phenomena using as few parameters as possible. Moreover, models have been derived from experimental studies from a variety of species under a range of environmental conditions (for example, all existing rabbit AP models incorporate a formulation of the rapid sodium current, INa, based on 30-year-old data from chick embryo cell aggregates). Here we develop a simple "parsimonious" rabbit AP model that is mathematically identifiable (i.e., not over parameterized) by combining a novel Hodgkin-Huxley formulation of INa with a phenomenological model of repolarization similar to the voltage dependent, time-independent rectifying outward potassium current (IK). The model was calibrated using the following experimental data sets measured from the same species (rabbit) under physiological conditions: dynamic current-voltage (I-V) relationships during the AP upstroke; rapid recovery of AP excitability during the relative refractory period; and steady-state INa inactivation via voltage clamp. Simulations reproduced several important "emergent" phenomena including cellular alternans at rates > 250 bpm as observed in rabbit myocytes, reentrant spiral waves as observed on the surface of the rabbit heart, and spiral wave breakup. Model variants were studied which elucidated the minimal requirements for alternans and spiral wave break up, namely the kinetics of INa inactivation and the non-linear rectification of IK. The simplicity of the model, and the fact that its parameters have physiological meaning, make it ideal for engendering generalizable mechanistic insight and should provide a solid "building-block" to generate more detailed ionic models to represent complex rabbit electrophysiology.
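The calibrated parameter values of the parsimonious model are not reproduced in the abstract; the sketch below only suggests the general shape of a two-current formulation (a Hodgkin-Huxley-type INa with m and h gates plus a simple time-independent outward IK), with invented constants, a linear IK in place of the paper's non-linear rectification, and forward-Euler integration.

```python
# Invented-parameter sketch of a two-current action-potential model in the spirit
# of the "parsimonious" approach. Constants are illustrative, not the calibrated
# values, and the outward current is simplified to a linear conductance.
import numpy as np

def simulate(t_end=200.0, dt=0.005):
    g_na, e_na = 11.0, 65.0          # mS/cm^2, mV (placeholders)
    g_k, e_k = 0.1, -83.0
    cm = 1.0                         # uF/cm^2
    v, m, h = e_k, 0.0, 0.9
    trace = []
    for step in range(int(t_end / dt)):
        t = step * dt
        m_inf = 1.0 / (1.0 + np.exp(-(v + 40.0) / 7.0))
        h_inf = 1.0 / (1.0 + np.exp((v + 67.0) / 7.0))
        m += dt * (m_inf - m) / 0.12           # fast activation gate
        h += dt * (h_inf - h) / 8.0            # slower inactivation gate
        i_na = g_na * m**3 * h * (v - e_na)
        i_k = g_k * (v - e_k)
        i_stim = -40.0 if t < 1.0 else 0.0     # brief depolarizing stimulus
        v += -dt * (i_na + i_k + i_stim) / cm
        trace.append(v)
    return np.array(trace)

v_trace = simulate()
print(f"peak potential {v_trace.max():.1f} mV, resting {v_trace[-1]:.1f} mV")
```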
Design for disassembly and sustainability assessment to support aircraft end-of-life treatment
NASA Astrophysics Data System (ADS)
Savaria, Christian
Gas turbine engine design is a multidisciplinary and iterative process. Many design iterations are necessary to address the challenges among the disciplines. In the creation of a new engine architecture, the design time is crucial in capturing new business opportunities. At the detail design phase, it was proven very difficult to correct an unsatisfactory design. To overcome this difficulty, the concept of Multi-Disciplinary Optimization (MDO) at the preliminary design phase (Preliminary MDO or PMDO) is used, allowing more freedom to perform changes in the design. PMDO also reduces the design time at the preliminary design phase. The concept of PMDO was used to create parametric models and new correlations for high-pressure gas turbine housing and shroud segments towards a new design process. First, dedicated parametric models were created because of their reusability and versatility. Their ease of use compared to non-parameterized models allows more design iterations and thus reduces set-up and design time. Second, geometry correlations were created to minimize the number of parameters used in turbine housing and shroud segment design. Since the turbine housing and the shroud segment geometries are required in tip clearance analyses, care was taken not to oversimplify the parametric formulation. In addition, a user interface was developed to interact with the parametric models and improve the design time. Third, the cooling flow predictions require many engine parameters (i.e., geometric and performance parameters and air properties) and a reference shroud segment. A second correlation study was conducted to minimize the number of engine parameters required in the cooling flow predictions and to facilitate the selection of a reference shroud segment. Finally, the parametric models, the geometry correlations, and the user interface resulted in a time saving of 50% and an increase in accuracy of 56% in the new design system compared to the existing design system. Also, regarding the cooling flow correlations, the number of engine parameters was reduced by a factor of 6 to create a simplified prediction model and hence a faster shroud segment selection process.
Improving both imaging speed and spatial resolution in MR-guided neurosurgery
NASA Astrophysics Data System (ADS)
Liu, Haiying; Hall, Walter A.; Truwit, Charles L.
2002-05-01
A robust near real-time MRI based surgical guidance scheme has been developed and used in neurosurgical procedures performed in our combined 1.5 Tesla MR operating room. Because of the increased susceptibility difference in the area of the surgical site during surgery, the preferred real-time imaging technique is a single shot imaging sequence based on the concept of half acquisition with turbo spin echoes (HASTE). In order to maintain sufficient spatial resolution for visualizing the surgical devices, such as a biopsy needle and catheter, we used a focused field of view (FOV) in the phase-encoding (PE) direction coupled with an out-volume signal suppression (OVS) technique. The key concept of the method is to minimize the total number of required phase encoding steps and the effective echo time (TE), as well as the longest TE for the high spatial encoding step. The concept was first demonstrated with a phantom experiment, which showed that, when the water was doped with Gd-DTPA to match the relaxation rates of brain tissue, there was significant spatial blurring, primarily along the phase-encoding direction, if the conventional HASTE technique was used; the new scheme indeed minimized the spatial blur in the resulting image and improved the needle visualization as anticipated. Using the new scheme in a typical MR-guided neurobiopsy procedure, the brain biopsy needle was easily seen against the tissue background with minimal blurring due to the inevitable T2 signal decay, even when the PE direction was set parallel to the needle axis. This MR based guidance technique has practically allowed neurosurgeons to visualize the biopsy needle and to monitor its insertion with better certainty at a near real-time pace.
Next Generation Polar Seismic Instrumentation Challenges
NASA Astrophysics Data System (ADS)
Parker, T.; Beaudoin, B. C.; Gridley, J.; Anderson, K. R.
2011-12-01
Polar region logistics are the limiting factor for deploying deep field seismic arrays. The IRIS PASSCAL Instrument Center, in collaboration with UNAVCO, designed and deployed several systems that address some of the logistical constraints of polar deployments. However, continued logistics pressures coupled with increasingly ambitious science projects require further reducing the logistics required for deploying both summer and over-winter stations. Our focus is to reduce station power requirements and bulk, thereby minimizing the time and effort required to deploy these arrays. We will reduce the weight of the battery bank by incorporating the most applicable new high-energy-density battery technology. Using these batteries will require a completely new power management system along with an appropriate smart enclosure. The other aspect will be to integrate the digitizing system with the sensor. Both of these technologies should reduce the installation time and the shipping volume and weight while reducing some instrument costs. We will also continue work on an effective Iridium telemetry solution for automated data return. The costs and limitations of polar deep-field science easily justify a specialized development effort, which pays off doubly in that we will continue to leverage the advancements in reduced logistics and increased performance for the benefit of low-latitude seismic research.
An efficient sparse matrix multiplication scheme for the CYBER 205 computer
NASA Technical Reports Server (NTRS)
Lambiotte, Jules J., Jr.
1988-01-01
This paper describes the development of an efficient algorithm for computing the product of a matrix and vector on a CYBER 205 vector computer. The desire to provide software which allows the user to choose between the often conflicting goals of minimizing central processing unit (CPU) time or storage requirements has led to a diagonal-based algorithm in which one of four types of storage is selected for each diagonal. The candidate storage types employed were chosen to be efficient on the CYBER 205 for diagonals which have nonzero structure which is dense, moderately sparse, very sparse and short, or very sparse and long; however, for many densities, no diagonal type is most efficient with respect to both resource requirements, and a trade-off must be made. For each diagonal, an initialization subroutine estimates the CPU time and storage required for each storage type based on results from previously performed numerical experimentation. These requirements are adjusted by weights provided by the user which reflect the relative importance the user places on the two resources. The adjusted resource requirements are then compared to select the most efficient storage and computational scheme.
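The abstract describes choosing, per diagonal, the storage scheme that minimizes a user-weighted combination of estimated CPU time and storage; the sketch below illustrates only that selection logic, with invented candidate types and cost estimates rather than the CYBER 205 measurements.

```python
# Illustrative selection of a storage type per matrix diagonal, minimizing a
# user-weighted sum of estimated CPU time and storage. The candidate types and
# cost models below are placeholders, not the CYBER 205 figures.
def pick_storage(diagonal_density, length, w_cpu=1.0, w_mem=1.0):
    estimates = {
        # storage type: (estimated CPU cost, estimated storage cost)
        "dense":        (1.0 * length,                    1.0 * length),
        "sparse_short": (3.0 * diagonal_density * length, 2.0 * diagonal_density * length),
        "sparse_long":  (2.5 * diagonal_density * length, 2.2 * diagonal_density * length),
        "moderate":     (1.6 * length * max(diagonal_density, 0.3),
                         1.4 * length * max(diagonal_density, 0.3)),
    }
    scored = {name: w_cpu * cpu + w_mem * mem for name, (cpu, mem) in estimates.items()}
    return min(scored, key=scored.get)

# A nearly empty diagonal favours a sparse scheme; a full one favours dense.
print(pick_storage(diagonal_density=0.05, length=10_000))
print(pick_storage(diagonal_density=0.95, length=10_000, w_cpu=2.0))
```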
ERIC Educational Resources Information Center
Kroeze, Willemieke; Oenema, Anke; Dagnelie, Pieter C.; Brug, Johannes
2008-01-01
This study investigated the minimally required feedback elements of a computer-tailored dietary fat reduction intervention to be effective in improving fat intake. In all 588 Healthy Dutch adults were randomly allocated to one of four conditions in an randomized controlled trial: (i) feedback on dietary fat intake [personal feedback (P feedback)],…
Laitila, Jussi; Moilanen, Atte; Pouzols, Federico M
2014-01-01
Biodiversity offsetting, which means compensation for ecological and environmental damage caused by development activity, has recently been gaining strong political support around the world. One common criticism levelled at offsets is that they exchange certain and almost immediate losses for uncertain future gains. In the case of restoration offsets, gains may be realized after a time delay of decades, and with considerable uncertainty. Here we focus on offset multipliers, which are ratios between damaged and compensated amounts (areas) of biodiversity. Multipliers have the attraction of being an easily understandable way of deciding the amount of offsetting needed. On the other hand, exact values of multipliers are very difficult to compute in practice if at all possible. We introduce a mathematical method for deriving minimum levels for offset multipliers under the assumption that offsetting gains must compensate for the losses (no net loss offsetting). We calculate absolute minimum multipliers that arise from time discounting and delayed emergence of offsetting gains for a one-dimensional measure of biodiversity. Despite the highly simplified model, we show that even the absolute minimum multipliers may easily be quite large, in the order of dozens, and theoretically arbitrarily large, contradicting the relatively low multipliers found in literature and in practice. While our results inform policy makers about realistic minimal offsetting requirements, they also challenge many current policies and show the importance of rigorous models for computing (minimum) offset multipliers. The strength of the presented method is that it requires minimal underlying information. We include a supplementary spreadsheet tool for calculating multipliers to facilitate application. PMID:25821578
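The paper's multiplier derivation is not reproduced in the abstract; purely to illustrate how discounting of delayed restoration gains inflates a no-net-loss multiplier, the toy calculation below scales gains realized after a delay of T years by an exponential discount factor. The formula and the rates are illustrative assumptions, not the authors' model.

```python
# Toy illustration (not the paper's model): if offsetting gains only materialize
# T years after the certain, immediate loss, and gains are discounted at rate r,
# a no-net-loss multiplier must be at least (1 + r)**T just to cancel the discount.
def minimum_multiplier(discount_rate, delay_years):
    return (1.0 + discount_rate) ** delay_years

for r, T in [(0.02, 20), (0.03, 40), (0.05, 50)]:
    print(f"r={r:.0%}, delay={T} y -> multiplier >= {minimum_multiplier(r, T):.1f}")
```

Longer delays and higher discount rates quickly push this lower bound into the dozens, which is consistent with the abstract's observation that minimum multipliers may easily be much larger than those found in practice.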
Recent advances in technologies required for a "Salad Machine".
Kliss, M; Heyenga, A G; Hoehn, A; Stodieck, L S
2000-01-01
Future long duration, manned space flight missions will require life support systems that minimize resupply requirements and ultimately approach self-sufficiency in space. Bioregenerative life support systems are a promising approach, but they are far from mature. Early in the development of the NASA Controlled Ecological Life Support System Program, the idea of onboard cultivation of salad-type vegetables for crew consumption was proposed as a first step away from the total reliance on resupply for food in space. Since that time, significant advances in space-based plant growth hardware have occurred, and considerable flight experience has been gained. This paper revisits the "Salad Machine" concept and describes recent developments in subsystem technologies for both plant root and shoot environments that are directly relevant to the development of such a facility.
NASA Technical Reports Server (NTRS)
Ratner, R. S.; Shapiro, E. B.; Zeidler, H. M.; Wahlstrom, S. E.; Clark, C. B.; Goldberg, J.
1973-01-01
This final report summarizes the work on the design of a fault tolerant digital computer for aircraft. Volume 2 is composed of two parts. Part 1 is concerned with the computational requirements associated with an advanced commercial aircraft. Part 2 reviews the technology that will be available for the implementation of the computer in the 1975-1985 period. With regard to the computation task, 26 computations have been categorized according to computational load, memory requirements, criticality, permitted down-time, and the need to save data in order to effect a roll-back. The technology part stresses the impact of large scale integration (LSI) on the realization of logic and memory. Also considered were module interconnection possibilities so as to minimize fault propagation.
Recent Advances in Technologies Required for a ``Salad Machine''
NASA Astrophysics Data System (ADS)
Kliss, M.; Heyenga, A. G.; Hoehn, A.; Stodieck, L. S.
Future long duration, manned space flight missions will require life support systems that minimize resupply requirements and ultimately approach self-sufficiency in space. Bioregenerative life support systems are a promising approach, but they are far from mature. Early in the development of the NASA Controlled Ecological Life Support System Program, the idea of onboard cultivation of salad-type vegetables for crew consumption was proposed as a first step away from the total reliance on resupply for food in space. Since that time, significant advances in space-based plant growth hardware have occurred, and considerable flight experience has been gained. This paper revisits the ``Salad Machine'' concept and describes recent developments in subsystem technologies for both plant root and shoot environments that are directly relevant to the development of such a facility.
Hauer, Grant; Vic Adamowicz, W L; Boutin, Stan
2018-07-15
Tradeoffs between cost and recovery targets for boreal caribou herds, threatened species in Alberta, Canada, are examined using a dynamic cost minimization model. Unlike most approaches used for minimizing costs of achieving threatened species targets, we incorporate opportunity costs of surface (forests) and subsurface resources (energy) as well as direct costs of conservation (habitat restoration and direct predator control), into a forward looking model of species protection. Opportunity costs of conservation over time are minimized with an explicit target date for meeting species recovery targets; defined as the number of self-sustaining caribou herds, which requires that both habitat and population targets are met by a set date. The model was run under various scenarios including three species recovery criteria, two oil and gas price regimes, and targets for the number of herds to recover from 1 to 12. The derived cost curve follows a typical pattern as costs of recovery per herd increase as the number of herds targeted for recovery increases. The results also show that the opportunity costs for direct predator control are small compared to habitat restoration and protection costs. However, direct predator control is essential for meeting caribou population targets and reducing the risk of extirpation while habitat is recovered over time. Copyright © 2018 Elsevier Ltd. All rights reserved.
Dynamic model of the octopus arm. II. Control of reaching movements.
Yekutieli, Yoram; Sagiv-Zohar, Roni; Hochner, Binyamin; Flash, Tamar
2005-08-01
The dynamic model of the octopus arm described in the first paper of this 2-part series was used here to investigate the neural strategies used for controlling the reaching movements of the octopus arm. These are stereotypical extension movements used to reach toward an object. In the dynamic model, sending a simple propagating neural activation signal to contract all muscles along the arm produced an arm extension with kinematic properties similar to those of natural movements. Control of only 2 parameters fully specified the extension movement: the amplitude of the activation signal (leading to the generation of muscle force) and the activation traveling time (the time the activation wave takes to travel along the arm). We found that the same kinematics could be achieved by applying activation signals with different activation amplitudes all exceeding some minimal level. This suggests that the octopus arm could use minimal amplitudes of activation to generate the minimal muscle forces required for the production of the desired kinematics. Larger-amplitude signals would generate larger forces that increase the arm's stability against perturbations without changing the kinematic characteristics. The robustness of this phenomenon was demonstrated by examining activation signals with either a constant or a bell-shaped velocity profile. Our modeling suggests that the octopus arm biomechanics may allow independent control of kinematics and resistance to perturbation during arm extension movements.
Cell-free protein synthesis in micro compartments: building a minimal cell from biobricks.
Jia, Haiyang; Heymann, Michael; Bernhard, Frank; Schwille, Petra; Kai, Lei
2017-10-25
The construction of a minimal cell that exhibits the essential characteristics of life is a great challenge in the field of synthetic biology. Assembling a minimal cell requires multidisciplinary expertise from physics, chemistry and biology. Scientists from different backgrounds tend to define the essence of 'life' differently and have thus proposed different artificial cell models possessing one or several essential features of living cells. Using the tools and methods of molecular biology, the bottom-up engineering of a minimal cell appears in reach. However, several challenges still remain. In particular, the integration of individual sub-systems that is required to achieve a self-reproducing cell model presents a complex optimization challenge. For example, multiple self-organisation and self-assembly processes have to be carefully tuned. We review advances and developments of new methods and techniques, for cell-free protein synthesis as well as micro-fabrication, for their potential to resolve challenges and to accelerate the development of minimal cells. Copyright © 2017 Elsevier B.V. All rights reserved.
Text-Based On-Line Conferencing: A Conceptual and Empirical Analysis Using a Minimal Prototype.
ERIC Educational Resources Information Center
McCarthy, John C.; And Others
1993-01-01
Analyzes requirements for text-based online conferencing through the use of a minimal prototype. Topics discussed include prototyping with a minimal system; text-based communication; the system as a message passer versus the system as a shared data structure; and three exercises that showed how users worked with the prototype. (Contains 61…
NASA Technical Reports Server (NTRS)
1977-01-01
The 20x9 TDI array was developed to meet the LANDSAT Thematic Mapper Requirements. This array is based upon a self-aligned, transparent gate, buried channel process. The process features: (1) buried channel, four phase, overlapping gate CCD's for high transfer efficiency without fat zero; (2) self-aligned transistors to minimize clock feedthrough and parasitic capacitance; and (3) transparent tin oxide electrode for high quantum efficiency with front surface irradiation. The requirements placed on the array and the performance achieved are summarized. These data are the result of flat-field measurements only; no imaging or dynamic target measurements were made during this program. Measurements were performed with two different test stands. The bench test equipment fabricated for this program operated at the 8 microsecond line time and employed simple sampling of the gated MOSFET output video signal. The second stand employed Correlated Double Sampling (CDS) and operated at a 79.2 microsecond line time.
Part-time hospitalization programs: the neglected field of community psychiatry.
Voineskos, G.
1976-01-01
Part-time hospitalization for persons with psychiatric disorders is underdeveloped, underutilized and often poorly understood, but should be encouraged in view of the unsatisfactory living conditions of patients discharged from hospital who still require care, the reductions in psychiatric inpatient populations and numbers of beds, the increasing costs of health services and the current fiscal restraints. Day and night hospitals can provide an alternative to inpatient or outpatient treatment, rehabilitation for the long-term patient or treatment for the patient in transition from inpatient to outpatient status. The day hospital can also provide a diagnostic setting. Such programs help preserve the patient's position in the family and the community, minimize the ill effects of hospitalization, and lower capital and operating costs of the psychiatric services. Awareness by medical and paramedical services of the value of these programs would increase their utilization. Shifting the emphasis of administrative and fiscal policies from inpatient to part-time hospitalization programs is also required. PMID:1253069
A high throughput MATLAB program for automated force-curve processing using the AdG polymer model.
O'Connor, Samantha; Gaddis, Rebecca; Anderson, Evan; Camesano, Terri A; Burnham, Nancy A
2015-02-01
Research in understanding biofilm formation is dependent on accurate and representative measurements of the steric forces related to brush on bacterial surfaces. A MATLAB program to analyze force curves from an AFM efficiently, accurately, and with minimal user bias has been developed. The analysis is based on a modified version of the Alexander and de Gennes (AdG) polymer model, which is a function of equilibrium polymer brush length, probe radius, temperature, separation distance, and a density variable. Automating the analysis reduces the amount of time required to process 100 force curves from several days to less than 2min. The use of this program to crop and fit force curves to the AdG model will allow researchers to ensure proper processing of large amounts of experimental data and reduce the time required for analysis and comparison of data, thereby enabling higher quality results in a shorter period of time. Copyright © 2014 Elsevier B.V. All rights reserved.
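The program itself is not reproduced here, but the fitting step can be illustrated with a commonly cited exponential approximation of the AdG sphere-brush force, F/R ≈ 50·kB·T·L0·Γ^(3/2)·exp(-2πD/L0); the exact prefactor, units, and cropping rules used by the authors' MATLAB code may differ, and all numbers below are illustrative.

```python
# Illustrative AdG-style fit of one force curve (assumed functional form and
# constants; not the authors' MATLAB implementation).
import numpy as np
from scipy.optimize import curve_fit

KT_PN_NM = 4.11          # thermal energy at ~298 K, in pN.nm
R_NM = 20.0              # assumed AFM probe radius, in nm

def adg_force_pN(D_nm, L0_nm, gamma_nm2):
    """Steric force vs separation: F ~ 50 kT R L0 Gamma^(3/2) exp(-2 pi D / L0)."""
    return 50.0 * KT_PN_NM * R_NM * L0_nm * gamma_nm2**1.5 * np.exp(-2.0 * np.pi * D_nm / L0_nm)

# Synthetic "measured" curve standing in for cropped AFM approach data
D = np.linspace(10.0, 250.0, 120)                          # separations, nm
F = adg_force_pN(D, 120.0, 0.005) * (1 + 0.03 * np.random.default_rng(1).normal(size=D.size))

(L0_fit, gamma_fit), _ = curve_fit(adg_force_pN, D, F, p0=(100.0, 0.003), maxfev=10000)
print(f"brush length L0 = {L0_fit:.1f} nm, grafting density = {gamma_fit:.4f} nm^-2")
```

Automating exactly this kind of crop-and-fit loop over hundreds of curves is what reduces the processing time from days to minutes in the reported workflow.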
Waste reduction plan for The Oak Ridge National Laboratory
DOE Office of Scientific and Technical Information (OSTI.GOV)
Schultz, R.M.
1990-04-01
The Oak Ridge National Laboratory (ORNL) is a multipurpose Research and Development (R&D) facility. These R&D activities generate numerous small waste streams. Waste minimization is defined as any action that minimizes the volume or toxicity of waste by avoiding its generation or recycling. This is accomplished by material substitution, changes to processes, or recycling wastes for reuse. Waste reduction is defined as waste minimization plus treatment which results in volume or toxicity reduction. The ORNL Waste Reduction Program will include both waste minimization and waste reduction efforts. Federal regulations, DOE policies and guidelines, increased costs and liabilities associated with the management of wastes, limited disposal options and facility capacities, and public consciousness have been motivating factors for implementing comprehensive waste reduction programs. DOE Order 5820.2A, Section 3.c.2.4 requires DOE facilities to establish an auditable waste reduction program for all LLW generators. In addition, it further states that any new facilities, or changes to existing facilities, incorporate waste minimization into design considerations. A more recent DOE Order, 3400.1, Section 4.b, requires the preparation of a waste reduction program plan which must be reviewed annually and updated every three years. Implementation of a waste minimization program for hazardous and radioactive mixed wastes is cited in DOE Order 5400.3, Section 7.d.5. This document has been prepared to address these requirements. 6 refs., 1 fig., 2 tabs.
Silicon Valley as an Early Adopter for On-Demand Civil VTOL Operations
NASA Technical Reports Server (NTRS)
Antcliff, Kevin R.; Moore, Mark D.; Goodrich, Kenneth H.
2016-01-01
With high incomes, long commutes, severe ground geographic constraints, severe highway congestion during peak commute times, high housing costs, and near perfect year-round weather, the Silicon Valley is positioned to be an excellent early adopter market for emerging aviation On-Demand Mobility transportation solutions. Prior efforts have attempted to use existing aviation platforms (helicopters or General Aviation aircraft) with existing infrastructure solutions, or only investigated new vehicle platforms without understanding how to incorporate new vehicle types into existing built-up communities. Research has been performed with the objective of minimizing door-to-door time for "Hyper Commuters" (frequent, long-distance commuters) in the Silicon Valley through the development of new helipad infrastructure for ultra-low noise Vertical Takeoff and Landing (VTOL) aircraft. Current travel times for chosen city-pairs across urban and suburban commutes are compared to future mobility concepts that provide significantly higher utilization and productivity to yield competitive operating costs compared to existing transportation choices. Helipads are introduced near current modes of transportation and infrastructure for ease-of-access, and maximizing proximity. Strategies for both private and public infrastructure development are presented that require no new land purchase while minimizing community noise exposure. New VTOL concepts are introduced with cruise speeds of 200 mph, which yield a greater than three times improvement in overall door-to-door time when compared to current automobiles, and in some cases, improvements of up to 6 times lower trip times.
Quantum Dynamics with Short-Time Trajectories and Minimal Adaptive Basis Sets.
Saller, Maximilian A C; Habershon, Scott
2017-07-11
Methods for solving the time-dependent Schrödinger equation via basis set expansion of the wave function can generally be categorized as having either static (time-independent) or dynamic (time-dependent) basis functions. We have recently introduced an alternative simulation approach which represents a middle road between these two extremes, employing dynamic (classical-like) trajectories to create a static basis set of Gaussian wavepackets in regions of phase-space relevant to future propagation of the wave function [J. Chem. Theory Comput., 11, 8 (2015)]. Here, we propose and test a modification of our methodology which aims to reduce the size of basis sets generated in our original scheme. In particular, we employ short-time classical trajectories to continuously generate new basis functions for short-time quantum propagation of the wave function; to avoid the continued growth of the basis set describing the time-dependent wave function, we employ Matching Pursuit to periodically minimize the number of basis functions required to accurately describe the wave function. Overall, this approach generates a basis set which is adapted to evolution of the wave function while also being as small as possible. In applications to challenging benchmark problems, namely a 4-dimensional model of photoexcited pyrazine and three different double-well tunnelling problems, we find that our new scheme enables accurate wave function propagation with basis sets which are around an order-of-magnitude smaller than our original trajectory-guided basis set methodology, highlighting the benefits of adaptive strategies for wave function propagation.
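The basis-pruning step rests on matching pursuit, which greedily keeps only the dictionary elements needed to represent a target to a set tolerance. The toy sketch below applies the idea to plain vectors with a random dictionary; the authors' implementation works on Gaussian wavepacket bases and differs in its details.

```python
# Generic matching pursuit sketch: greedily select the fewest dictionary
# columns needed to represent a target vector to a given tolerance.
import numpy as np

def matching_pursuit(target, dictionary, tol=1e-3, max_atoms=50):
    """Return (indices, coefficients) of the selected atoms.

    dictionary: (n, m) array whose columns are normalized basis vectors.
    """
    residual = target.astype(float).copy()
    indices, coeffs = [], []
    for _ in range(max_atoms):
        overlaps = dictionary.T @ residual          # projection onto every atom
        k = int(np.argmax(np.abs(overlaps)))        # best-matching atom
        c = overlaps[k]
        indices.append(k)
        coeffs.append(c)
        residual = residual - c * dictionary[:, k]  # peel off its contribution
        if np.linalg.norm(residual) < tol * np.linalg.norm(target):
            break
    return indices, coeffs

# Tiny usage example with a random normalized dictionary
rng = np.random.default_rng(0)
D = rng.normal(size=(64, 256))
D /= np.linalg.norm(D, axis=0)
x = 2.0 * D[:, 3] + 0.5 * D[:, 100]
idx, c = matching_pursuit(x, D)
print("atoms kept:", len(idx))
```

Running such a pruning pass periodically keeps the basis adapted to the current wave function while holding its size down, which is the mechanism behind the order-of-magnitude reduction reported in the abstract.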
NASA Astrophysics Data System (ADS)
Ouyang, Qi; Lu, Wenxi; Hou, Zeyu; Zhang, Yu; Li, Shuai; Luo, Jiannan
2017-05-01
In this paper, a multi-algorithm genetically adaptive multi-objective (AMALGAM) method is proposed as a multi-objective optimization solver. It was implemented in the multi-objective optimization of a groundwater remediation design at sites contaminated by dense non-aqueous phase liquids. In this study, there were two objectives: minimization of the total remediation cost, and minimization of the remediation time. A non-dominated sorting genetic algorithm II (NSGA-II) was adopted to compare with the proposed method. For efficiency, the time-consuming surfactant-enhanced aquifer remediation simulation model was replaced by a surrogate model constructed by a multi-gene genetic programming (MGGP) technique. Similarly, two other surrogate modeling methods-support vector regression (SVR) and Kriging (KRG)-were employed to make comparisons with MGGP. In addition, the surrogate-modeling uncertainty was incorporated in the optimization model by chance-constrained programming (CCP). The results showed that, for the problem considered in this study, (1) the solutions obtained by AMALGAM incurred less remediation cost and required less time than those of NSGA-II, indicating that AMALGAM outperformed NSGA-II. It was additionally shown that (2) the MGGP surrogate model was more accurate than SVR and KRG; and (3) the remediation cost and time increased with the confidence level, which can enable decision makers to make a suitable choice by considering the given budget, remediation time, and reliability.
Optimal Time Advance In Terminal Area Arrivals: Throughput vs. Fuel Savings
NASA Technical Reports Server (NTRS)
Sadovsky, Alexander V .; Swenson, Harry N.; Haskell, William B.; Rakas, Jasenka
2011-01-01
The current operational practice in scheduling air traffic arriving at an airport is to adjust flight schedules by delay, i.e. a postponement of an aircraft's arrival at a scheduled location, to manage safely the FAA-mandated separation constraints between aircraft. To meet the observed and forecast growth in traffic demand, however, the practice of time advance (speeding up an aircraft toward a scheduled location) is envisioned for future operations as a practice additional to delay. Time advance has two potential advantages. The first is the capability to minimize, or at least reduce, the excess separation (the distances between pairs of aircraft immediately in-trail) and thereby to increase the throughput of the arriving traffic. The second is to reduce the total traffic delay when the traffic sample is below saturation density. A cost associated with time advance is the fuel expenditure required by an aircraft to speed up. We present an optimal control model of air traffic arriving in a terminal area and solve it using the Pontryagin Maximum Principle. The admissible controls allow time advance, as well as delay, some of the way. The cost function reflects the trade-off between minimizing two competing objectives: excess separation (negatively correlated with throughput) and fuel burn. A number of instances are solved using three different methods, to demonstrate consistency of solutions.
A lightweight distributed framework for computational offloading in mobile cloud computing.
Shiraz, Muhammad; Gani, Abdullah; Ahmad, Raja Wasim; Adeel Ali Shah, Syed; Karim, Ahmad; Rahman, Zulkanain Abdul
2014-01-01
The latest developments in mobile computing technology have enabled intensive applications on the modern Smartphones. However, such applications are still constrained by limitations in processing potentials, storage capacity and battery lifetime of the Smart Mobile Devices (SMDs). Therefore, Mobile Cloud Computing (MCC) leverages the application processing services of computational clouds for mitigating resource limitations in SMDs. Currently, a number of computational offloading frameworks are proposed for MCC wherein the intensive components of the application are outsourced to computational clouds. Nevertheless, such frameworks focus on runtime partitioning of the application for computational offloading, which is time consuming and resource intensive. The resource-constrained nature of SMDs requires lightweight procedures for leveraging computational clouds. Therefore, this paper presents a lightweight framework which focuses on minimizing additional resource utilization in computational offloading for MCC. The framework employs features of centralized monitoring, high availability and on demand access services of computational clouds for computational offloading. As a result, the turnaround time and execution cost of the application are reduced. The framework is evaluated by testing a prototype application in the real MCC environment. The lightweight nature of the proposed framework is validated by employing computational offloading for the proposed framework and the latest existing frameworks. Analysis shows that by employing the proposed framework for computational offloading, the size of data transmission is reduced by 91%, energy consumption cost is minimized by 81% and turnaround time of the application is decreased by 83.5% as compared to the existing offloading frameworks. Hence, the proposed framework minimizes additional resource utilization and therefore offers a lightweight solution for computational offloading in MCC.
Initial results with minimally invasive repair of pectus carinatum.
Kálmán, Attila
2009-08-01
Pectus carinatum is traditionally repaired by using some modification of the open Ravitch procedure, with its possible long-term sequelae, such as poor postoperative compliance of the chest. In this study we assessed our results with a new minimally invasive repair of pectus carinatum that requires neither cartilage incision nor sternotomy. From June 2005, we have corrected pectus carinatum using a method analogous to the Nuss procedure for pectus excavatum repair. Thus far, we performed this intervention on 14 patients (mean age, 15 +/- 1.5 years). A steel bar has been inserted at the level of the maximum point of the sternal protrusion through small lateral incisions. The sternum is pushed back without osteotomy or chondrotomy. The bar is removed after 2 years. Patients' characteristics, operation time, hospital stay, and complications have been recorded. In 1 patient with asymmetric deformity, 2 bars were placed. Operative time was 42 +/- 20 minutes (mean +/- standard deviation), and hospital stay was 3 days (median; interquartile range, 3-4 days) postoperatively. We experienced lateral shift of the bar in 1 patient, which was treated with remodeling and repositioning of the bar. No other complication occurred during the follow-up period (mean, 18 months; range, 2-38 months). Thirteen of the 14 patients reported excellent or very good results. Patients returned to full activity within 2 months. Five bars have been removed. Minimally invasive repair of pectus carinatum leaves the integrity of the chest wall untouched. It is safe with a short operative time and hospital stay and provides good results, even in asymmetric cases.
Design and Sizing of the Air Revitalization System for Altair Lunar Lander
NASA Technical Reports Server (NTRS)
Allada, Rama Kumar
2009-01-01
Designing closed-loop Air Revitalization Systems (ARS) for human spaceflight applications requires a delicate balance between designing for system robustness while minimizing system power and mass requirements. This presentation will discuss the design of the ARS for the Altair Lunar Lander. The presentation will illustrate how dynamic simulations, using Aspen Custom Modeler, were used to develop a system configuration with the ability to control atmospheric conditions under a wide variety of circumstances while minimizing system mass/volume and the impact on overall power requirements for the Lander architecture.
Minimally invasive surgery: lateral approach interbody fusion: results and review.
Youssef, Jim A; McAfee, Paul C; Patty, Catherine A; Raley, Erin; DeBauche, Spencer; Shucosky, Erin; Chotikul, Liana
2010-12-15
A retrospective review of patients treated at 2 institutions with anterior lumbar interbody fusion using a minimally invasive lateral retroperitoneal approach, and review of literature. To analyze the outcomes from historical literature and from a retrospectively compiled database of patients having undergone anterior interbody fusions performed through a lateral approach. A paucity of published literature exists describing outcomes following lateral approach fusion surgery. Patients treated with extreme lateral interbody fusion (XLIF) were identified through retrospective chart review. Treatment variables included operating room (OR) time, estimated blood loss (EBL), length of hospital stay (LOS), complications, and fusion rate. A literature review, using the National Center for Biotechnology Information databases PubMed/MEDLINE and Google Scholar, yielded 14 peer-reviewed articles reporting outcomes scoring, complications, fusion status, long-term follow-up, and radiographic assessments related to XLIF. Published XLIF results were summarized and evaluated with current study data. A total of 84 XLIF patients were included in the current cohort analysis. OR time, EBL, and length of hospital stay averaged 199 minutes, 155 mL, and 2.6 days, respectively, and perioperative and postoperative complication rates were 2.4% and 6.1%. Mean follow-up was 15.7 months. Sixty-eight patients showed evidence of solid arthrodesis and no subsidence on computed tomography and flexion/extension radiographs. Results were within the ranges of those in the literature. Literature review identified reports of significant improvements in clinical outcomes scores, radiographic measures, and cost effectiveness. Current data corroborates and contributes to the existing body of literature describing XLIF outcomes. Procedures are generally performed with short OR times, minimal EBL, and few complications. Patients recover quickly, requiring minimal hospital stay, although transient hip/thigh pain and/or weakness is common. Long-term outcomes are generally favorable, with maintained improvements in patient-reported pain and function scores as well as radiographic parameters, including high rates of fusion.
Discrete-time model reduction in limited frequency ranges
NASA Technical Reports Server (NTRS)
Horta, Lucas G.; Juang, Jer-Nan; Longman, Richard W.
1991-01-01
A mathematical formulation for model reduction of discrete time systems such that the reduced order model represents the system in a particular frequency range is discussed. The algorithm transforms the full order system into balanced coordinates using frequency weighted discrete controllability and observability grammians. In this form a criterion is derived to guide truncation of states based on their contribution to the frequency range of interest. Minimization of the criterion is accomplished without need for numerical optimization. Balancing requires the computation of discrete frequency weighted grammians. Closed-form solutions for the computation of frequency weighted grammians are developed. Numerical examples are discussed to demonstrate the algorithm.
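For context, the unweighted discrete-time version of the procedure (grammians over all frequencies, balancing, truncation of low-contribution states) can be sketched as follows; the paper's contribution is the frequency-weighted grammians and the closed-form expressions for them, which are not reproduced here. The sketch assumes a stable, minimal realization.

```python
# Sketch of discrete-time balanced truncation (unweighted). The paper restricts
# the grammians to a frequency band of interest; here they cover all frequencies.
import numpy as np
from scipy.linalg import solve_discrete_lyapunov, cholesky, svd

def balanced_truncation(A, B, C, r):
    """Reduce discrete-time (A, B, C) to order r via balanced truncation."""
    Wc = solve_discrete_lyapunov(A, B @ B.T)      # A Wc A' - Wc + B B' = 0
    Wo = solve_discrete_lyapunov(A.T, C.T @ C)    # A' Wo A - Wo + C' C = 0
    Lc = cholesky(Wc, lower=True)
    Lo = cholesky(Wo, lower=True)
    U, s, Vt = svd(Lo.T @ Lc)                     # s holds the Hankel singular values
    T = Lc @ Vt.T @ np.diag(s ** -0.5)            # balancing transformation
    Tinv = np.diag(s ** -0.5) @ U.T @ Lo.T
    Ab, Bb, Cb = Tinv @ A @ T, Tinv @ B, C @ T
    return Ab[:r, :r], Bb[:r, :], Cb[:, :r], s    # keep the r states with largest s

# Small stable example
rng = np.random.default_rng(0)
A = np.diag([0.9, 0.5, 0.1, -0.3]) + 0.01 * rng.normal(size=(4, 4))
B = rng.normal(size=(4, 1)); C = rng.normal(size=(1, 4))
Ar, Br, Cr, hsv = balanced_truncation(A, B, C, r=2)
print("Hankel singular values:", np.round(hsv, 4))
```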
Temporary bypass for superior vena cava reconstruction with Anthron bypass tube™
Yamasaki, Naoya; Tsuchiya, Tomoshi; Miyazaki, Takuro; Kamohara, Ryotaro; Hatachi, Go; Nagayasu, Takeshi
2017-01-01
Total superior vena cava (SVC) clamping for SVC replacement or repair can be used in thoracic surgery. A bypass technique is an option to avoid hemodynamic instability and cerebral venous hypertension and hypoperfusion. The present report describes a venous bypass technique using Anthron bypass tube™ for total SVC clamping. Indications for this procedure include the need for a temporary bypass between the brachiocephalic vein and atrium for complete tumor resection. This procedure allows the surgeons sufficient time to complete replacement of SVC or partial resection of SVC without adverse effects. Further, it is a relatively simple procedure requiring minimal time. PMID:28840027
Li, Chunsheng; Ansari, Armin; Etherington, George; Jourdain, Jean-Rene; Kukhta, Boris; Kurihara, Osamu; Lopez, Maria Antonia; Ménétrier, Florence; dos Reis, Arlene Alves; Solomon, Stephen; Zhang, Jiangfeng; Carr, Zhanat
2017-01-01
Following a radiological or nuclear emergency, first responders and the public may become internally contaminated with radioactive materials, as demonstrated during the Goiânia, Chernobyl and Fukushima accidents. Timely monitoring of the affected populations for potential internal contamination, assessment of radiation dose and the provision of necessary medical treatment are required to minimize the health risks from the contamination. This paper summarizes the guidelines and tools that have been developed, and identifies the gaps and priorities for future projects. PMID:27521210
Parallel algorithm for computation of second-order sequential best rotations
NASA Astrophysics Data System (ADS)
Redif, Soydan; Kasap, Server
2013-12-01
Algorithms for computing an approximate polynomial matrix eigenvalue decomposition of para-Hermitian systems have emerged as a powerful, generic signal processing tool. A technique that has shown much success in this regard is the sequential best rotation (SBR2) algorithm. Proposed is a scheme for parallelising SBR2 with a view to exploiting the modern architectural features and inherent parallelism of field-programmable gate array (FPGA) technology. Experiments show that the proposed scheme can achieve low execution times while requiring minimal FPGA resources.
Ant colony system algorithm for the optimization of beer fermentation control.
Xiao, Jie; Zhou, Ze-Kui; Zhang, Guang-Xin
2004-12-01
Beer fermentation is a dynamic process that must be guided along a temperature profile to obtain the desired results. An ant colony system algorithm was applied to optimize the kinetic model of this process. During a fixed period of fermentation time, a series of different temperature profiles of the mixture were constructed, and an optimal profile was selected. The optimal temperature profile maximized the final ethanol production and minimized the byproducts concentration and spoilage risk. The satisfactory results were obtained with little computational effort.
NASA Technical Reports Server (NTRS)
Seldner, K.
1977-01-01
An algorithm was developed to optimally control the traffic signals at each intersection using a discrete time traffic model applicable to heavy or peak traffic. Off-line optimization procedures were applied to compute the cycle splits required to minimize the lengths of the vehicle queues and delay at each intersection. The method was applied to an extensive traffic network in Toledo, Ohio. Results obtained with the derived optimal settings are compared with the control settings presently in use.
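The underlying idea can be illustrated with a toy single-intersection version: queues evolve in discrete time as q(k+1) = max(0, q(k) + arrivals - saturation flow x green fraction), and the cycle split is chosen off line to minimize the accumulated queue. The arrival and saturation-flow figures below are made up for illustration, and the report's network-wide model is considerably richer.

```python
# Toy sketch of off-line cycle-split optimization for one intersection with two
# competing approaches; not the NASA report's network model.
import numpy as np

def total_queue(split, arrivals1, arrivals2, sat_flow=20.0):
    """Sum of both queues over all cycles for a given green split (0..1)."""
    q1 = q2 = 0.0
    cost = 0.0
    for a1, a2 in zip(arrivals1, arrivals2):
        q1 = max(0.0, q1 + a1 - sat_flow * split)            # served during green
        q2 = max(0.0, q2 + a2 - sat_flow * (1.0 - split))
        cost += q1 + q2
    return cost

arr1 = np.full(60, 12.0)   # vehicles per cycle, approach 1 (heavier)
arr2 = np.full(60, 6.0)    # vehicles per cycle, approach 2
splits = np.linspace(0.1, 0.9, 81)
best = min(splits, key=lambda s: total_queue(s, arr1, arr2))
print(f"best green split for approach 1: {best:.2f}")
```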
Computer modelling of grain microstructure in three dimensions
NASA Astrophysics Data System (ADS)
Narayan, K. Lakshmi
We present a program that generates the two-dimensional micrographs of a three dimensional grain microstructure. The code utilizes a novel scanning, pixel mapping technique to secure statistical distributions of surface areas, grain sizes, aspect ratios, perimeters, number of nearest neighbors and volumes of the randomly nucleated particles. The program can be used for comparing the existing theories of grain growth, and interpretation of two-dimensional microstructure of three-dimensional samples. Special features have been included to minimize the computation time and resource requirements.
NASA Technical Reports Server (NTRS)
Ghil, M.; Balgovind, R.
1979-01-01
The inhomogeneous Cauchy-Riemann equations in a rectangle are discretized by a finite difference approximation. Several different boundary conditions are treated explicitly, leading to algorithms which have overall second-order accuracy. All boundary conditions with either u or v prescribed along a side of the rectangle can be treated by similar methods. The algorithms presented here have nearly minimal time and storage requirements and seem suitable for development into a general-purpose direct Cauchy-Riemann solver for arbitrary boundary conditions.
Rapid development of Proteomic applications with the AIBench framework.
López-Fernández, Hugo; Reboiro-Jato, Miguel; Glez-Peña, Daniel; Méndez Reboredo, José R; Santos, Hugo M; Carreira, Ricardo J; Capelo-Martínez, José L; Fdez-Riverola, Florentino
2011-09-15
In this paper we present two case studies of Proteomics applications development using the AIBench framework, a Java desktop application framework mainly focused in scientific software development. The applications presented in this work are Decision Peptide-Driven, for rapid and accurate protein quantification, and Bacterial Identification, for Tuberculosis biomarker search and diagnosis. Both tools work with mass spectrometry data, specifically with MALDI-TOF spectra, minimizing the time required to process and analyze the experimental data. Copyright 2011 The Author(s). Published by Journal of Integrative Bioinformatics.
MacDonald, Russell D; Thomas, Laura; Rusk, Frederick C; Marques, Shauna D; McGuire, Dan
2010-01-01
Transport medicine personnel are potentially exposed to jet fuel combustion products. Setting-specific data are required to determine whether this poses a risk. This study assessed exposure to jet fuel combustion products, compared various engine ignition scenarios, and determined methods to minimize exposure. The Beechcraft King Air B200 turboprop aircraft equipped with twin turbine engines, using a kerosene-based jet fuel (Jet A-1), was used to measure products of combustion during boarding, engine startup, and flight in three separate engine start scenarios ("shielded": internal engine start, door closed; "exposed": ground power unit start, door open; and "minimized": ground power unit right engine start, door open). Real-time continuous monitoring equipment was used for oxygen, carbon dioxide, carbon monoxide, nitrogen dioxide, hydrogen sulfide, sulfur dioxide, volatile organic compounds, and particulate matter. Integrated methods were used for aldehydes, polycyclic aromatic hydrocarbons, volatile organic compounds, and aliphatic hydrocarbons. Samples were taken in the paramedic breathing zone for approximately 60 minutes, starting just before the paramedics boarded the aircraft. Data were compared against regulated time-weighted exposure thresholds to determine the presence of potentially harmful products of combustion. Polycyclic aromatic hydrocarbons, aldehydes, volatile organic compounds, and aliphatic hydrocarbons were found at very low concentrations or beneath the limits of detection. There were significant differences in exposures to particulates, carbon monoxide, and total volatile organic compound between the "exposed" and "minimized" scenarios. Elevated concentrations of carbon monoxide and total volatile organic compounds were present during the ground power unit-assisted dual-engine start. There were no appreciable exposures during the "minimized" or "shielded" scenarios. Air medical personnel exposures to jet fuel combustion products were generally low and did not exceed established U.S. or Canadian health and safety exposure limits. Avoidance of ground power unit-assisted dual-engine starts and closing the hangar door prior to start minimize or eliminate the occupational exposure.
NASA Astrophysics Data System (ADS)
Fung, Kenneth K. H.; Lewis, Geraint F.; Wu, Xiaofeng
2017-04-01
A vast wealth of literature exists on the topic of rocket trajectory optimisation, particularly in the area of interplanetary trajectories due to its relevance today. Studies on optimising interstellar and intergalactic trajectories are usually performed in flat spacetime using an analytical approach, with very little focus on optimising interstellar trajectories in a general relativistic framework. This paper examines the use of low-acceleration rockets to reach galactic destinations in the least possible time, with a genetic algorithm being employed for the optimisation process. The fuel required for each journey was calculated for various types of propulsion systems to determine the viability of low-acceleration rockets to colonise the Milky Way. The results showed that to limit the amount of fuel carried on board, an antimatter propulsion system would likely be the minimum technological requirement to reach star systems tens of thousands of light years away. However, using a low-acceleration rocket would require several hundreds of thousands of years to reach these star systems, with minimal time dilation effects since maximum velocities only reached about 0.2 c . Such transit times are clearly impractical, and thus, any kind of colonisation using low acceleration rockets would be difficult. High accelerations, on the order of 1 g, are likely required to complete interstellar journeys within a reasonable time frame, though they may require prohibitively large amounts of fuel. So for now, it appears that humanity's ultimate goal of a galactic empire may only be possible at significantly higher accelerations, though the propulsion technology requirement for a journey that uses realistic amounts of fuel remains to be determined.
NASA Astrophysics Data System (ADS)
Niakan, F.; Vahdani, B.; Mohammadi, M.
2015-12-01
This article proposes a multi-objective mixed-integer model to optimize the location of hubs within a hub network design problem under uncertainty. The considered objectives include minimizing the maximum accumulated travel time, minimizing the total costs including transportation, fuel consumption and greenhouse emissions costs, and finally maximizing the minimum service reliability. In the proposed model, it is assumed that for connecting two nodes, there are several types of arc in which their capacity, transportation mode, travel time, and transportation and construction costs are different. Moreover, in this model, determining the capacity of the hubs is part of the decision-making procedure and balancing requirements are imposed on the network. To solve the model, a hybrid solution approach is utilized based on inexact programming, interval-valued fuzzy programming and rough interval programming. Furthermore, a hybrid multi-objective metaheuristic algorithm, namely multi-objective invasive weed optimization (MOIWO), is developed for the given problem. Finally, various computational experiments are carried out to assess the proposed model and solution approaches.
Estimation of anomaly location and size using electrical impedance tomography.
Kwon, Ohin; Yoon, Jeong Rock; Seo, Jin Keun; Woo, Eung Je; Cho, Young Gu
2003-01-01
We developed a new algorithm that estimates locations and sizes of anomalies in electrically conducting medium based on electrical impedance tomography (EIT) technique. When only the boundary current and voltage measurements are available, it is not practically feasible to reconstruct accurate high-resolution cross-sectional conductivity or resistivity images of a subject. In this paper, we focus our attention on the estimation of locations and sizes of anomalies with different conductivity values compared with the background tissues. We showed the performance of the algorithm from experimental results using a 32-channel EIT system and saline phantom. With about 1.73% measurement error in boundary current-voltage data, we found that the minimal size (area) of the detectable anomaly is about 0.72% of the size (area) of the phantom. Potential applications include the monitoring of impedance related physiological events and bubble detection in two-phase flow. Since this new algorithm requires neither any forward solver nor time-consuming minimization process, it is fast enough for various real-time applications in medicine and nondestructive testing.
Athletes and blood clots: individualized, intermittent anticoagulation management.
Berkowitz, J N; Moll, S
2017-06-01
Essentials Athletes on anticoagulants are typically prohibited from participation in contact sports. Short-acting anticoagulants allow for reconsideration of this precedent. An individualized pharmacokinetic/pharmacodynamics study can aid patient-specific management. Many challenges and unresolved issues exist regarding such tailored intermittent dosing. Athletes with venous thromboembolism (VTE) are typically prohibited from participating in contact sports during anticoagulation therapy, but such mandatory removal from competition can cause psychological and financial detriments for athletes and overlooks patient autonomy. The precedent of compulsory removal developed when options for anticoagulation therapy were more limited, but medical advances now allow for rethinking of the management of athletes with VTE. We propose a novel therapeutic approach to the treatment of athletes who participate in contact sports and require anticoagulation. A personalized pharmacokinetic/pharmacodynamics study of a direct oral anticoagulant can be performed for an athlete, which can inform the timing of medication dosing. Managed carefully, this can allow athletic participation when plasma drug concentration is minimal (minimizing bleeding risk) and prompt resumption of treatment after the risk of bleeding sufficiently normalizes (maximizing therapeutic time). © 2017 International Society on Thrombosis and Haemostasis.
Process for laser machining and surface treatment
Neil, George R.; Shinn, Michelle D.
2004-10-26
An improved method and apparatus that increase the accuracy and reduce the time required to machine materials and surface treat materials, and that allow better control of defects such as particulates in pulsed laser deposition. The speed and quality of machining are improved by combining an ultrashort pulsed laser at high average power with a continuous wave laser. The ultrashort pulsed laser provides an initial ultrashort pulse, on the order of several hundred femtoseconds, to stimulate an electron avalanche in the target material. Coincident with the ultrashort pulse or shortly after it, a pulse from a continuous wave laser is applied to the target. The micromachining method and apparatus create an initial ultrashort laser pulse to ignite the ablation followed by a longer laser pulse to sustain and enlarge on the ablation effect launched in the initial pulse. The pulse pairs are repeated at a high pulse repetition frequency and as often as desired to produce the desired micromachining effect. The micromachining method enables a lower threshold for ablation, provides more deterministic damage, minimizes the heat-affected zone, minimizes cracking or melting, and reduces the time involved to create the desired machining effect.
Efficient sparse matrix multiplication scheme for the CYBER 203
NASA Technical Reports Server (NTRS)
Lambiotte, J. J., Jr.
1984-01-01
This work has been directed toward the development of an efficient algorithm for performing sparse matrix multiplication on the CYBER-203. The desire to provide software which gives the user the choice between the often conflicting goals of minimizing central processing unit (CPU) time or storage requirements has led to a diagonal-based algorithm in which one of three types of storage is selected for each diagonal. For each storage type, an initialization sub-routine estimates the CPU and storage requirements based upon results from previously performed numerical experimentation. These requirements are adjusted by weights provided by the user which reflect the relative importance the user places on the resources. The three storage types employed were chosen to be efficient on the CYBER-203 for diagonals which are sparse, moderately sparse, or dense; however, for many densities, no single storage type is most efficient with respect to both resource requirements. The user-supplied weights dictate the choice.
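The diagonal-based storage idea can be sketched in modern terms as follows (here for a matrix-vector product in plain Python rather than CYBER-203 vector code, and with only two storage types instead of three): each nonzero diagonal is stored either densely or as index-value pairs depending on its fill-in, and the product is accumulated diagonal by diagonal.

```python
# Illustrative sketch of per-diagonal storage selection and a diagonal-wise
# matrix-vector product; not the original CYBER-203 implementation.
import numpy as np

def build_diagonals(A, dense_threshold=0.5):
    diags = []
    n = A.shape[0]
    for k in range(-n + 1, n):
        d = np.diagonal(A, offset=k)
        nnz = np.count_nonzero(d)
        if nnz == 0:
            continue
        if nnz / d.size >= dense_threshold:
            diags.append(("dense", k, d.copy()))
        else:
            idx = np.nonzero(d)[0]
            diags.append(("sparse", k, (idx, d[idx].copy())))
    return diags

def diag_matvec(diags, x):
    n = x.size
    y = np.zeros(n)
    for kind, k, data in diags:
        # element i of diagonal k is A[i, i+k] for k >= 0, and A[i-k, i] for k < 0
        if kind == "dense":
            if k >= 0:
                y[:n - k] += data * x[k:]
            else:
                y[-k:] += data * x[:n + k]
        else:
            idx, vals = data
            if k >= 0:
                y[idx] += vals * x[idx + k]
            else:
                y[idx - k] += vals * x[idx]
    return y

# quick check against dense multiplication on a random banded matrix
n = 50
A = np.triu(np.tril(np.random.default_rng(2).normal(size=(n, n)), 2), -2)
A[np.abs(A) < 0.8] = 0.0                        # thin some diagonals out
x = np.random.default_rng(3).normal(size=n)
assert np.allclose(diag_matvec(build_diagonals(A), x), A @ x)
```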
DATA QUALITY OBJECTIVES FOR SELECTING WASTE SAMPLES FOR THE BENCH STEAM REFORMER TEST
DOE Office of Scientific and Technical Information (OSTI.GOV)
BANNING DL
2010-08-03
This document describes the data quality objectives to select archived samples located at the 222-S Laboratory for Fluid Bed Steam Reformer testing. The type, quantity and quality of the data required to select the samples for Fluid Bed Steam Reformer testing are discussed. In order to maximize the efficiency and minimize the time to treat Hanford tank waste in the Waste Treatment and Immobilization Plant, additional treatment processes may be required. One of the potential treatment processes is the fluid bed steam reformer (FBSR). A determination of the adequacy of the FBSR process to treat Hanford tank waste is required. The initial step in determining the adequacy of the FBSR process is to select archived waste samples from the 222-S Laboratory that will be used to test the FBSR process. Analyses of the selected samples will be required to confirm the samples meet the testing criteria.
Stone, Christopher M.; Williams, Derrick C.; Price, Jeremy P.
2016-09-23
The Extended Q-Range Small-Angle Neutron Scattering Diffractometer (EQ-SANS) instrument at the Spallation Neutron Source (SNS), Oak Ridge, Tennessee, incorporates a 69 m3 detector vessel with a vacuum system which required an upgrade with respect to performance, ease of operation, and maintenance. The upgrade focused on improving pumping performance as well as optimizing system design to minimize opportunity for operational error. This upgrade provided the following practical contributions: the time required to evacuate from atmospheric pressure to 2 mTorr was reduced from 500-1,000 minutes to 60-70 minutes, and turn-key automated control with a multi-faceted interlock for personnel and machine safety was provided.
Schmid, Georg H.; Gaffron, Hans
1967-01-01
Neither an over-all deficiency of chlorophyll, nor an increased enzymatic capacity for maximal rates, nor an unusual lamellar structure was found to change the number of quanta required for the evolution of one molecule of oxygen in healthy aurea mutants of tobacco. The average minimal quantum number remains 10 (efficiency 0.1) as in many algae and typical higher plants. Most of the time the optimal efficiency depends on the availability of some far-red radiation, particularly in the blue region of the spectrum where blue light alone is rather inefficient. These results fit an explanation offered earlier in connection with the hydrogen or acetate photometabolism of algae in far-red light. PMID:19873573
DOE Office of Scientific and Technical Information (OSTI.GOV)
Stone, Christopher M.; Williams, Derrick C.; Price, Jeremy P.
The Extended Q-Range Small-Angle Neutron Scattering Diffractometer (EQ-SANS) instrument at the Spallation Neutron Source (SNS), Oak Ridge, Tennessee, incorporates a 69 m3 detector vessel with a vacuum system which required an upgrade with respect to performance, ease of operation, and maintenance. The upgrade focused on improving pumping performance as well as optimizing system design to minimize opportunity for operational error. This upgrade provided the following practical contributions: the time required to evacuate from atmospheric pressure to 2 mTorr was reduced from 500-1,000 minutes to 60-70 minutes, and turn-key automated control with a multi-faceted interlock for personnel and machine safety was provided.
Advanced Mirror Technology Development (AMTD) Thermal Trade Studies
NASA Technical Reports Server (NTRS)
Brooks, Thomas; Stahl, Phil; Arnold, Bill
2015-01-01
Advanced Mirror Technology Development (AMTD) is being done at Marshall Space Flight Center (MSFC) in preparation for the next Ultraviolet, Optical, Infrared (UVOIR) space observatory. A likely science mission of that observatory is the detection and characterization of 'Earth-like' exoplanets. Direct exoplanet observation requires a telescope to see a planet that is 10^-10 times dimmer than its host star. To accomplish this using an internal coronagraph requires a telescope with an ultra-stable wavefront. This paper investigates two topics: 1) parametric relationships between a primary mirror's thermal parameters and wavefront stability, and 2) optimal temperature profiles in the telescope's shroud and heater plate that minimize static wavefront error (WFE) in the primary mirror.
A mixed-integer linear programming approach to the reduction of genome-scale metabolic networks.
Röhl, Annika; Bockmayr, Alexander
2017-01-03
Constraint-based analysis has become a widely used method to study metabolic networks. While some of the associated algorithms can be applied to genome-scale network reconstructions with several thousands of reactions, others are limited to small or medium-sized models. In 2015, Erdrich et al. introduced a method called NetworkReducer, which reduces large metabolic networks to smaller subnetworks, while preserving a set of biological requirements that can be specified by the user. Already in 2001, Burgard et al. developed a mixed-integer linear programming (MILP) approach for computing minimal reaction sets under a given growth requirement. Here we present an MILP approach for computing minimum subnetworks with the given properties. The minimality (with respect to the number of active reactions) is not guaranteed by NetworkReducer, while the method by Burgard et al. does not allow specifying the different biological requirements. Our procedure is about 5-10 times faster than NetworkReducer and can enumerate all minimum subnetworks in case there exist several ones. This allows identifying common reactions that are present in all subnetworks, and reactions appearing in alternative pathways. Applying complex analysis methods to genome-scale metabolic networks is often not possible in practice. Thus it may become necessary to reduce the size of the network while keeping important functionalities. We propose a MILP solution to this problem. Compared to previous work, our approach is more efficient and allows computing not only one, but even all minimum subnetworks satisfying the required properties.
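A Burgard-style minimum-reaction-set MILP can be written down compactly: a binary variable marks each reaction as kept, fluxes are forced to zero for discarded reactions, steady state must hold, and a required flux (for example growth) must be achievable. The sketch below uses the PuLP modelling library and a made-up toy network; it follows the general formulation rather than the authors' exact implementation.

```python
# Hedged sketch of a minimum-subnetwork MILP (general formulation, not the
# authors' code). Binary y[j] marks reaction j as kept; fluxes are forced to
# zero when y[j] = 0, S v = 0 enforces steady state, and a target flux must be met.
import numpy as np
import pulp

def minimum_subnetwork(S, lb, ub, target_idx, target_min):
    m, n = S.shape
    prob = pulp.LpProblem("minimum_subnetwork", pulp.LpMinimize)
    v = [pulp.LpVariable(f"v_{j}", lb[j], ub[j]) for j in range(n)]
    y = [pulp.LpVariable(f"y_{j}", cat="Binary") for j in range(n)]
    prob += pulp.lpSum(y)                                      # objective: fewest reactions
    for i in range(m):                                         # steady state: S v = 0
        prob += pulp.lpSum(float(S[i, j]) * v[j] for j in range(n)) == 0
    for j in range(n):                                         # v_j = 0 unless reaction kept
        prob += v[j] <= ub[j] * y[j]
        prob += v[j] >= lb[j] * y[j]
    prob += v[target_idx] >= target_min                        # required functionality
    prob.solve(pulp.PULP_CBC_CMD(msg=False))
    return [j for j in range(n) if y[j].value() > 0.5]

# Toy network: three metabolites, five reactions, reaction 4 is the "growth" flux
S = np.array([[ 1, -1,  0, -1,  0],
              [ 0,  1, -1,  0,  0],
              [ 0,  0,  1,  1, -1]])
kept = minimum_subnetwork(S, lb=[0] * 5, ub=[10] * 5, target_idx=4, target_min=1.0)
print("reactions kept:", kept)
```

Enumerating all optima (as the paper does) can then be achieved by re-solving with integer cuts that exclude previously found reaction sets.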
Cathcart, Paul; Murphy, Declan G; Moon, Daniel; Costello, Anthony J; Frydenberg, Mark
2011-04-01
• To systematically review the current literature concerning perioperative, functional and oncological outcomes reported after open and minimally invasive prostate cancer surgery specifically from institutions within Australasia. • Four electronic databases were searched to identify studies reporting outcome after open and minimally invasive prostate cancer surgery. Studies were sought using the search term 'radical prostatectomy'. • In all, 11,378 articles were retrieved. For the purpose of this review, data were only extracted from studies reporting Australasian experience. • A total of 28 studies met final inclusion criteria. • Overall, the data are limited by the low methodological quality of available studies. • Only two comparative studies evaluating open radical prostatectomy (ORP) and robotic-assisted laparoscopic RP (RALP) were identified, both non-randomized. • The mean blood loss, catheterization time and hospital stay was shorter after RALP than with ORP. In contrast, mean operative procedure time was significantly longer for RALP. • Overall adverse event rates were similar for the different surgical approaches although the rate of bladder neck stricture was significantly higher after open RP. • Incorporation of patient outcomes achieved by surgeons still within their learning curve resulted in a trend towards higher positive surgical margin rates and lower continence scores after RALP. However, there was equivalence once the surgeons' learning curve was overcome. Given the limited follow-up for RALP and laparoscopic RP (14.7 and 6 months vs 43.8 months for ORP) and the lack of data concerning erectile function status, comparison of biochemical failure and potency was not possible. • Few comparative data are available from Australasia concerning open and minimally invasive prostate cancer surgery. • While perioperative outcomes appear to favour minimally invasive approaches, further comparative assessment of functional and long-term oncological efficacy for the different surgical approaches is required to better define the role of minimally invasive approaches. © 2011 THE AUTHORS. BJU INTERNATIONAL © 2011 BJU INTERNATIONAL.
Space Shuttle Main Engine - The Relentless Pursuit of Improvement
NASA Technical Reports Server (NTRS)
VanHooser, Katherine P.; Bradley, Douglas P.
2011-01-01
The Space Shuttle Main Engine (SSME) is the only reusable large liquid rocket engine ever developed. The specific impulse delivered by the staged combustion cycle, substantially higher than previous rocket engines, minimized volume and weight for the integrated vehicle. The dual pre-burner configuration permitted precise mixture ratio and thrust control while the fully redundant controller and avionics provided a very high degree of system reliability and health diagnosis. The main engine controller design was the first rocket engine application to incorporate digital processing. The engine was required to operate at a high chamber pressure to minimize engine volume and weight. Power level throttling was required to minimize structural loads on the vehicle early in flight and acceleration levels on the crew late in ascent. Fatigue capability, strength, ease of assembly and disassembly, inspectability, and materials compatibility were all major considerations in achieving a fully reusable design. During the multi-decade program the design evolved substantially using a series of block upgrades. A number of materials and manufacturing challenges were encountered throughout SSME s history. Significant development was required for the final configuration of the high pressure turbopumps. Fracture control was implemented to assess life limits of critical materials and components. Survival in the hydrogen environment required assessment of hydrogen embrittlement. Instrumentation systems were a challenge due to the harsh thermal and dynamic environments within the engine. Extensive inspection procedures were developed to assess the engine components between flights. The Space Shuttle Main Engine achieved a remarkable flight performance record. All flights were successful with only one mission requiring an ascent abort condition, which still resulted in an acceptable orbit and mission. This was achieved in large part via extensive ground testing to fully characterize performance and to establish acceptable life limits. During the program over a million seconds of accumulated test and flight time was achieved. Post flight inspection and assessment was a key part of assuring proper performance of the flight hardware. By the end of the program the predicted reliability had improved by a factor of four. These unique challenges, evolution of the design, and the resulting reliability will be discussed in this paper.
Modeling Soil Moisture in Support of the Revegetation of Military Lands in Arid Regions.
NASA Astrophysics Data System (ADS)
Caldwell, T. G.; McDonald, E. V.; Young, M. H.
2003-12-01
The National Training Center (NTC), the Army's primary mechanized maneuver training facility, covers approximately 2600 km2 within the Mojave Desert in southern California, and is the subject of ongoing studies to support the sustainability of military lands in desert environments. Revegetation of these lands by the Integrated Training Areas Management (ITAM) Program requires the identification of optimum growing conditions to reestablish desert vegetation from seed and seedling, especially with regard to the timing and abundance of plant-available water. Water content, soil water potential, and soil temperature were continuously monitored and used to calibrate the Simultaneous Heat And Water (SHAW) model at 3 re-seeded sites. Modeled irrigation scenarios were used to further evaluate the most effective volume, frequency, and timing of irrigation required to maximize revegetation success and minimize water use. Surface treatments including straw mulch, gravel mulch, soil tackifier and plastic sheet
36 CFR 223.218 - Consistency with plans, environmental standards, and other management requirements.
Code of Federal Regulations, 2010 CFR
2010-07-01
...) Minimize soil erosion; (e) Maintain favorable conditions of water flow and quality; (f) Minimize adverse effects on, protect, or enhance other national forest resources, uses, and improvements; and (g) Deposit...
The Euclid AOCS science mode design
NASA Astrophysics Data System (ADS)
Bacchetta, A.; Saponara, M.; Torasso, A.; Saavedra Criado, G.; Girouart, B.
2015-06-01
Euclid is a Medium-Class mission of the ESA Cosmic Vision 2015-2025 plan. Thales Alenia Space Italy has been selected as prime contractor for the Euclid design and implementation. The spacecraft will be launched in 2020 on a Soyuz launch vehicle from Kourou, to a large-amplitude orbit around the sun-earth libration point L2. The objective of Euclid is to understand the origin of the Universe's accelerating expansion, by mapping large-scale structure over a cosmic time covering the last 10 billion years. The mission requires the ability to survey a large fraction of the extragalactic sky (i.e. portion of sky with latitude higher than 30 deg with respect to galactic plane) over its lifetime, with very high system stability (telescope, focal plane, spacecraft pointing) to minimize systematic effects. The AOCS is a key element to meet the scientific requirements. The AOCS design drivers are pointing performance and image quality (Relative Pointing Error over 700 s less than 25 mas, 68% confidence level), and minimization of slew time between observation fields to meet the goal of completing the Wide Extragalactic Survey in 6 years. The first driver demands a Fine Guidance Sensor in the telescope focal plane for accurate attitude measurement and actuators with low noise and fine command resolution. The second driver requires high-torque actuators and an extended attitude control bandwidth. In the design, reaction wheels (RWL) and cold-gas micro-propulsion (MPS) are used in a synergetic and complementary way during different operational phases of the science mode. The RWL are used for performing the field slews, whereas during scientific observation they are stopped not to perturb the pointing by additional mechanical noise. The MPS is used for maintaining the reference attitude with high pointing accuracy during the scientific observation. This unconventional concept achieves the pointing performance with the shortest maneuver times, with significant mass savings with respect to the MPS-only solution.
Park, Sung Joon; Jeong, Woo-Jin; Ahn, Soon-Hyun
2017-11-01
The purpose of this study was to propose a novel, minimally invasive transaxillary approach for harvesting the scapular tip and latissimus dorsi osteomyogenous free flap for the reconstruction of a maxillectomy defect. A retrospective case series study of 4 patients who underwent reconstruction using a scapular tip composite free flap through the transaxillary approach was conducted. The data (age, sex, pathology, previous treatment and adjuvant treatment) were collected and analysed. Total operation time, number of hospital days and the cosmetic and functional outcome of reconstruction were analysed. Two male and two female patients were enrolled in this study. The patients' ages ranged from 52 to 59 years. All the patients had maxillectomy defects, with at least a classification of Okay type II, which were successfully reconstructed using a scapular tip and latissimus dorsi free flap through a minimally invasive transaxillary approach. The entire operation time for the primary tumour surgery and reconstruction ranged from 6.2 to 12.1 h (mean, 11.1 h). The average length of the hospital stay was 13 days (range, 10-16 days). No major donor site morbidity was observed, and there was no graft failure that required revision or exploration surgery. The minimally invasive transaxillary approach for harvesting the scapular tip and latissimus dorsi osteomyogenous free flap for the reconstruction of maxillectomy defect is a promising approach for more favourable functional and aesthetic outcomes when compared to the use of other bone containing free flaps and the classic approach for harvesting scapular tip and latissimus dorsi free flap. Copyright © 2017 British Association of Plastic, Reconstructive and Aesthetic Surgeons. Published by Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Hamada, Aulia; Rosyidi, Cucuk Nur; Jauhari, Wakhid Ahmad
2017-11-01
Minimizing processing time in a production system can increase the efficiency of a manufacturing company. Processing time is influenced by the application of modern technology and by machining parameters. Modern technology can be applied through CNC machining, and one machining process that can be performed on a CNC machine is turning. However, the machining parameters affect not only the processing time but also the environmental impact. Hence, an optimization model is needed to set the machining parameters so that processing time and environmental impact are minimized. This research developed a multi-objective optimization model to minimize processing time and environmental impact in the CNC turning process, yielding optimal values of the decision variables cutting speed and feed rate. Environmental impact is converted from environmental burden through the use of Eco-indicator 99. The model was solved using the OptQuest optimization software from Oracle Crystal Ball.
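A minimal sketch of the weighted-sum form of such a model is shown below; it uses the standard turning-time relation t = πDL/(1000·v·f), while the eco-impact coefficients, roughness limit, workpiece dimensions, and weights are placeholders rather than values from the study (which itself uses Eco-indicator 99 and OptQuest).

```python
import numpy as np

# Toy multi-objective turning model: weighted sum of processing time and an
# eco-impact proxy over a grid of cutting speed v and feed rate f.
D_mm, L_mm = 50.0, 120.0          # workpiece diameter and cut length (assumed)
w_time, w_eco = 0.5, 0.5          # objective weights (assumed)
nose_r, Ra_max = 0.8, 2.0         # tool nose radius (mm), roughness limit (um)

def processing_time(v, f):        # v in m/min, f in mm/rev -> minutes
    return np.pi * D_mm * L_mm / (1000.0 * v * f)

def eco_impact(v, f):             # placeholder Eco-indicator-99-style score
    return processing_time(v, f) * (0.5 + 0.002 * v)

v, f = np.meshgrid(np.linspace(100, 300, 201), np.linspace(0.10, 0.40, 121))
objective = w_time * processing_time(v, f) + w_eco * eco_impact(v, f)
roughness = 1000.0 * f**2 / (32.0 * nose_r)     # ideal surface roughness, um
objective[roughness > Ra_max] = np.inf          # enforce the quality constraint

i, j = np.unravel_index(np.argmin(objective), objective.shape)
print(f"cutting speed {v[i, j]:.0f} m/min, feed {f[i, j]:.3f} mm/rev, "
      f"time {processing_time(v[i, j], f[i, j]):.2f} min")
```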
CAVE3: A general transient heat transfer computer code utilizing eigenvectors and eigenvalues
NASA Technical Reports Server (NTRS)
Palmieri, J. V.; Rathjen, K. A.
1978-01-01
The method of solution is a hybrid analytical numerical technique which utilizes eigenvalues and eigenvectors. The method is inherently stable, permitting large time steps even with the best of conductors with the finest of mesh sizes which can provide a factor of five reduction in machine time compared to conventional explicit finite difference methods when structures with small time constants are analyzed over long time periods. This code will find utility in analyzing hypersonic missile and aircraft structures which fall naturally into this class. The code is a completely general one in that problems involving any geometry, boundary conditions and materials can be analyzed. This is made possible by requiring the user to establish the thermal network conductances between nodes. Dynamic storage allocation is used to minimize core storage requirements. This report is primarily a user's manual for CAVE3 code. Input and output formats are presented and explained. Sample problems are included which illustrate the usage of the code as well as establish the validity and accuracy of the method.
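The report itself documents CAVE3's input and output formats; as background, a minimal sketch of the eigenvalue/eigenvector idea for a linear thermal network dT/dt = A·T + b (with illustrative conductance-to-capacitance ratios, not a CAVE3 model) shows why the step size is not limited by stability:

```python
import numpy as np

# Three-node network: node 1 insulated at one end, node 3 tied to a 300 K
# boundary. A holds conductance/capacitance ratios (1/s); values are illustrative.
A = np.array([[-1.0,  1.0,  0.0],
              [ 1.0, -2.0,  1.0],
              [ 0.0,  1.0, -2.0]])
b = np.array([0.0, 0.0, 300.0])        # boundary source term
T0 = np.full(3, 500.0)                 # initial temperatures, K

lam, V = np.linalg.eig(A)              # eigen-decomposition done once
T_ss = -np.linalg.solve(A, b)          # steady-state temperatures
c = np.linalg.solve(V, T0 - T_ss)      # modal amplitudes of the transient part

def temperature(t):
    # Exact for the linear network, so the time step is not limited by stability.
    return (V @ (c * np.exp(lam * t))).real + T_ss

for t in (0.0, 10.0, 1000.0):          # arbitrarily large steps, unlike explicit FD
    print(t, temperature(t))
```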
Kearney, Peter; Li, Wen-Chin; Yu, Chung-San; Braithwaite, Graham
2018-06-26
This research investigated controllers' situation awareness by comparing COOPANS's acoustic alerts with newly designed semantic alerts. The results demonstrate that ATCOs' visual scan patterns differed significantly between the acoustic and semantic designs: ATCOs established different eye-movement patterns in fixation number, fixation duration, and saccade velocity. Effective decision support systems require human-centred design with effective stimuli to direct ATCOs' attention to critical events. It is necessary to provide ATCOs with specific alerting information reflecting the nature of the critical situation in order to minimize the side effects of startle and inattentional deafness. Consequently, the design of a semantic alert can significantly reduce ATCOs' response time, providing valuable extra time in a time-limited situation to formulate and execute resolution strategies in critical air safety events. The findings of this research indicate that the context-specific design of semantic alerts could improve ATCOs' situational awareness and significantly reduce response time in the event of Short Term Conflict Alert activation, which warns that two aircraft have less than the required lateral or vertical separation.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dubart, Philippe; Hautot, Felix; Morichi, Massimo
Good management of dismantling and decontamination (D and D) operations and activities requires safety, time savings, and thorough radiological knowledge of the contaminated environment, as well as optimization of personnel dose and minimization of waste volume. At the same time, the Fukushima accident has stretched the operational approach to nuclear measurement, requiring in such emergency situations fast deployment and intervention, quick analysis, and rapid scenario definition. Drawing on experience from its activities at Fukushima and at D and D sites, AREVA has developed a novel multi-sensor solution as part of its D and D research, approach, and method: a system providing real-time 3D photo-realistic spatial cartography of the radiation distribution in contaminated premises. The system may be hand-held or mounted on a mobile device (e.g., robot or drone). In this paper, we present our current development, based on SLAM (Simultaneous Localization And Mapping) technology and integrated sensors and detectors allowing simultaneous topographic and radiological (dose rate and/or spectroscopy) data acquisition. This enabling technology permits 3D gamma activity cartography in real time. (authors)
Congestion relief by travel time minimization in near real time : Detroit area I-75 corridor study.
DOT National Transportation Integrated Search
2008-12-01
"This document summarizes the activities concerning the project Congestion Relief by Travel Time Minimization in Near Real Time -- Detroit Area I-75 Corridor Study since the inception of the project (Nov. 22, 2006 through September 30, 2008). ...
Claassen, Cindy; Kashner, T Michael; Kashner, Tetyana K; Xuan, Lei; Larkin, Gregory L
2011-01-01
Adequate preparedness for acts of terrorism and mass violence requires a thorough understanding of the postdisaster mental health needs of all exposed groups, including those watching such events from a distance. This study examined emergency psychiatric treatment-seeking patterns following media exposure to four national terrorist or mass casualty events. An event was selected for study if (a) it precipitated local front-page headlines for >5 consecutive days and (b) emergency service psychiatrists identified it as specifically precipitating help-seeking in the study hospital. Four events qualified: the Oklahoma City bombing (1995), the Columbine High School (1999) and Wedgewood Baptist Church (1999) shootings and the terrorist attacks of September 11, 2001. Time-series analyses were used to correct for autocorrelation in visit patterns during the postdisaster week, and equivalent time periods from years before and after each event were used as control years. Overall, disaster week census did not differ significantly from predisaster weeks, although 3-day nonsignificant decreases in visit rate were observed following each disaster. Treatment-seeking for anxiety-related issues showed a nonsignificant increase following each disaster, which became significant in the "all disaster" model (t=5.17; P=.006). Intensity of media coverage did not impact rate of help-seeking in any analysis. Although these sentinel US disasters varied in scope, method, geographic proximity to the study site, perpetrator characteristics, public response, sequelae and degree of media coverage, the extent to which they impacted emergency department treatment-seeking was minimal. Geographically distant mass violence and disaster events of the type and scope studied here may require only minimal mental health "surge capacity" in the days following the event. Copyright © 2011 Elsevier Inc. All rights reserved.
More realistic power estimation for new user, active comparator studies: an empirical example.
Gokhale, Mugdha; Buse, John B; Pate, Virginia; Marquis, M Alison; Stürmer, Til
2016-04-01
Pharmacoepidemiologic studies are often expected to be sufficiently powered to study rare outcomes, but there is a sequential loss of power as study design options that minimize bias are implemented. We illustrate this using a study comparing pancreatic cancer incidence after initiating dipeptidyl-peptidase-4 inhibitors (DPP-4i) versus thiazolidinediones or sulfonylureas. We identified Medicare beneficiaries with at least one claim of DPP-4i or comparators during 2007-2009 and then applied the following steps: (i) exclude prevalent users, (ii) require a second prescription of the same drug, (iii) exclude prevalent cancers, (iv) exclude patients aged <66 years, and (v) censor for treatment changes during follow-up. Power to detect hazard ratios ≥ 2.0 (an effect measure strongly driven by the number of events), estimated after step 5, was compared with the naïve power estimated prior to step 1. There were 19,388 and 28,846 DPP-4i and thiazolidinedione initiators during 2007-2009. The number of drug initiators dropped most after requiring a second prescription, outcomes dropped most after excluding patients with prevalent cancer, and person-time dropped most after requiring a second prescription and as-treated censoring. The naïve power (>99%) was considerably higher than the power obtained after the final step (~75%). In designing new-user, active-comparator studies, one should be mindful of how steps taken to minimize bias affect sample size, number of outcomes, and person-time. While actual numbers will depend on specific settings, applying generic percentage losses will improve power estimates compared with the naïve approach, which largely ignores the steps taken to increase validity. Copyright © 2015 John Wiley & Sons, Ltd.
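The dependence of power on the number of events can be made concrete with Schoenfeld's approximation for the Cox model; the event counts below are illustrative only, since the abstract does not report them.

```python
from math import log, sqrt
from statistics import NormalDist

def cox_power(n_events, hr, p_exposed=0.5, alpha=0.05):
    """Approximate power to detect hazard ratio `hr` given `n_events` outcome
    events, using Schoenfeld's formula for the Cox model (two-sided test)."""
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)
    z = sqrt(n_events * p_exposed * (1 - p_exposed)) * abs(log(hr)) - z_alpha
    return NormalDist().cdf(z)

# Hypothetical event counts, chosen only to match the order of the reported powers.
print(cox_power(n_events=60, hr=2.0))    # ~0.77, roughly the final-step power
print(cox_power(n_events=150, hr=2.0))   # ~0.99, roughly the naive estimate
```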
Ahmad, Arshad; Kant, Rama; Gupta, Avneet
2013-08-01
Both Doppler-guided hemorrhoidal artery ligation (DG-HAL) and infrared coagulation (IRC) are well-established techniques in the management of hemorrhoids. The aim of the study is to compare the clinical outcomes of DG-HAL and IRC in patients with grade 1 and 2 hemorrhoids. A total of 296 patients were registered for the study, but 51 patients were lost to follow-up; hence, 245 patients were finally included in the analysis. Patients were randomized into two groups (mean age, 42 years; range, 19-60 years). Group A (n = 116) was treated with DG-HAL and group B (n = 129) with IRC. Patients were examined at 1 week, 1 month, and 6 months after the procedure. The mean time taken for HAL was 21 min and for IRC, 12 min. The cost of the DG-HAL procedure was 1,440 rupees ($31.53) and that of IRC, 376 rupees ($8). The mean duration of hospital stay after HAL was 6 h and after IRC, 2 h. Control of symptoms with HAL was 96%, whereas with IRC, 81%. The postoperative complication rate for HAL was 2%, whereas for IRC, 13%. A repeat procedure was required in 9% of patients after HAL and in 28% after IRC. Both procedures are minimally invasive, associated with minimal discomfort, and suitable for day-care surgery. IRC requires a shorter procedure time and postoperative hospital stay and has a lower procedure cost, whereas DG-HAL is more effective in controlling the symptoms of hemorrhoids, has a lower postoperative complication rate, and less often requires a repeat procedure.
Establishment of a rotor model basis
NASA Technical Reports Server (NTRS)
Mcfarland, R. E.
1982-01-01
Radial-dimension computations in the RSRA's blade-element model are modified for both the acquisition of extensive baseline data and for real-time simulation use. The baseline data, which are for the evaluation of model changes, use very small increments and are of high quality. The modifications to the real-time simulation model are for accuracy improvement, especially when a minimal number of blade segments is required for real-time synchronization. An accurate technique for handling tip loss in discrete blade models is developed. The mathematical consistency and convergence properties of summation algorithms for blade forces and moments are examined and generalized integration coefficients are applied to equal-annuli midpoint spacing. Rotor conditions identified as 'constrained' and 'balanced' are used and the propagation of error is analyzed.
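The report's generalized integration coefficients are not reproduced here; as a point of reference, a minimal sketch of equal-annuli midpoint spacing for N blade segments, under the usual equal-area definition, is:

```python
import numpy as np

# Equal-annuli midpoint spacing: the disc between the root cutout and the tip
# is split into N annuli of equal area, and each blade segment is evaluated at
# the radius that bisects its annulus area. Values are illustrative.
def equal_annuli_midpoints(R, n_segments, root_cutout=0.0):
    k = np.arange(1, n_segments + 1)
    area_frac = (k - 0.5) / n_segments
    return np.sqrt(root_cutout**2 + area_frac * (R**2 - root_cutout**2))

print(equal_annuli_midpoints(R=7.0, n_segments=5, root_cutout=1.0))
```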
NASA Technical Reports Server (NTRS)
Hou, Gene
2004-01-01
The focus of this research is on the development of analysis and sensitivity analysis equations for nonlinear, transient heat transfer problems modeled by a p-version, time-discontinuous finite element approximation. The resulting matrix form of the state equation is simply A(x)x = c, representing a single-step, time-marching scheme. The Newton-Raphson method is used to solve the nonlinear equation. Examples are first provided to demonstrate the accuracy characteristics of the resultant finite element approximation. A direct differentiation approach is then used to compute the thermal sensitivities of a nonlinear heat transfer problem. The report shows that only minimal coding effort is required to enhance the analysis code with the sensitivity analysis capability.
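The "minimal coding effort" point can be seen in a scalar caricature of the scheme: once the Newton-Raphson solve of A(x)x = c has converged, the direct-differentiation sensitivity only requires one additional linear solve with the same Jacobian. The toy model a(x) = k0(1 + βx) below is illustrative, not the report's finite element system.

```python
# Solve a(x)*x = c with a(x) = k0*(1 + beta*x), then compute dx/dk0 by
# direct differentiation of the converged state equation. Values are arbitrary.
k0, beta, c = 2.0, 0.05, 10.0

def residual(x):
    return k0 * (1.0 + beta * x) * x - c

def jacobian(x):                 # d(residual)/dx
    return k0 * (1.0 + 2.0 * beta * x)

x = c / k0                       # initial guess from the linear problem
for _ in range(20):              # Newton-Raphson iteration
    dx = -residual(x) / jacobian(x)
    x += dx
    if abs(dx) < 1e-12:
        break

# Direct differentiation: d(residual)/dk0 + jacobian(x) * dx/dk0 = 0, a linear
# equation reusing the converged Jacobian -- the "minimal extra coding" step.
dresidual_dk0 = (1.0 + beta * x) * x
dx_dk0 = -dresidual_dk0 / jacobian(x)

print(f"x = {x:.6f}, dx/dk0 = {dx_dk0:.6f}")
```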
On-loom, real-time, noncontact detection of fabric defects by ultrasonic imaging.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chien, H. T.
1998-09-08
A noncontact, on-loom ultrasonic inspection technique was developed for real-time 100% defect inspection of fabrics. A prototype was built and tested successfully on loom. The system is compact, rugged, low cost, requires minimal maintenance, is not sensitive to fabric color and vibration, and can easily be adapted to current loom configurations. Moreover, it can detect defects in both the pick and warp directions. The system is capable of determining the size, location, and orientation of each defect. To further improve the system, air-coupled transducers with higher efficiency and sensitivity need to be developed. Advanced detection algorithms also need to be developed for better classification and categorization of defects in real-time.
Applying Squeaky-Wheel Optimization to Schedule Airborne Astronomy Observations
NASA Technical Reports Server (NTRS)
Frank, Jeremy; Kuerklue, Elif
2004-01-01
We apply the Squeaky Wheel Optimization (SWO) algorithm to the problem of scheduling astronomy observations for the Stratospheric Observatory for Infrared Astronomy, an airborne observatory. The problem contains complex constraints relating the feasibility of an astronomical observation to the position and time at which the observation begins, telescope elevation limits, special use airspace, and available fuel. Solving the problem requires making discrete choices (e.g. selection and sequencing of observations) and continuous ones (e.g. takeoff time and setting up observations by repositioning the aircraft). The problem also includes optimization criteria such as maximizing observing time while simultaneously minimizing total flight time. Previous approaches to the problem fail to scale when accounting for all constraints. We describe how to customize SWO to solve this problem, and show that it finds better flight plans, often with less computation time, than previous approaches.
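A minimal sketch of the SWO construct/analyze/prioritize loop on a toy observation-scheduling problem is given below; the observation set, windows, and flight horizon are invented for illustration and omit the paper's continuous choices (takeoff time, aircraft repositioning) and its airspace and fuel constraints.

```python
import random

# Toy problem: each observation has a duration (min) and a visibility window,
# and the flight can hold at most `horizon` minutes of observing.
random.seed(0)
obs = {name: {"dur": random.randint(20, 60),
              "window": (random.randint(0, 200), random.randint(300, 480))}
       for name in "ABCDEFGH"}
horizon = 480
priority = {name: 1.0 for name in obs}

def construct(priority):
    """Greedy constructor: place observations in priority order."""
    t, plan = 0, []
    for name in sorted(obs, key=priority.get, reverse=True):
        start = max(t, obs[name]["window"][0])
        end = start + obs[name]["dur"]
        if end <= min(horizon, obs[name]["window"][1]):
            plan.append((name, start, end))
            t = end
    return plan

best = []
for _ in range(30):                      # construct / analyze / prioritize loop
    plan = construct(priority)
    if sum(e - s for _, s, e in plan) > sum(e - s for _, s, e in best):
        best = plan
    scheduled = {name for name, _, _ in plan}
    for name in obs:                     # "squeaky wheels": unscheduled items
        if name not in scheduled:
            priority[name] += 1.0        # get earlier treatment next round
print(best)
```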
Task analysis method for procedural training curriculum development.
Riggle, Jakeb D; Wadman, Michael C; McCrory, Bernadette; Lowndes, Bethany R; Heald, Elizabeth A; Carstens, Patricia K; Hallbeck, M Susan
2014-06-01
A central venous catheter (CVC) is an important medical tool used in critical care and emergent situations. Integral to proper care in many circumstances, insertion of a CVC introduces the risk of central line-associated blood stream infections and mechanical adverse events; proper training is important for safe CVC insertion. Cognitive task analysis (CTA) methods have been successfully implemented in the medical field to improve the training of postgraduate medical trainees, but can be very time-consuming to complete and require a significant time commitment from many subject matter experts (SMEs). Many medical procedures such as CVC insertion are linear processes with well-documented procedural steps. These linear procedures may not require a traditional CTA to gather the information necessary to create a training curriculum. Accordingly, a novel, streamlined CTA method designed primarily to collect cognitive cues for linear procedures was developed to be used by medical professionals with minimal CTA training. This new CTA methodology required fewer trained personnel, fewer interview sessions, and less time commitment from SMEs than a traditional CTA. Based on this study, a streamlined CTA methodology can be used to efficiently gather cognitive information on linear medical procedures for the creation of resident training curricula and procedural skills assessments.
Cutter Connectivity Bandwidth Study
NASA Astrophysics Data System (ADS)
2002-10-01
The goal of this study was to determine how much bandwidth is required for cutters to meet emerging data transfer requirements. The Cutter Connectivity Business Solutions Team, with guidance from the Commandant's Innovation Council, sponsored this study. Today, many Coast Guard administrative and business functions are conducted via electronic means. Although our larger cutters can establish part-time connectivity using commercial satellite communications (SATCOM) while underway, there are numerous complaints regarding poor application performance. Additionally, smaller cutters do not have any standard means of underway connectivity. The R&D study shows that the most important factor affecting the performance of web and enterprise applications onboard cutters was latency, the time it takes the signal to reach the satellite and come back down through space. The latency caused by the use of higher-orbit satellites leads to poor application performance and inefficient use of expensive SATCOM links. To improve performance, the CC must (1) reduce latency by using alternate communications links such as low-earth-orbit satellites, (2) tailor applications to the SATCOM link, and/or (3) optimize the protocols used for data communication to minimize the time required by present applications to establish communications between the user and the host systems.
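The latency penalty of higher-orbit satellites is easy to quantify from geometry alone; the sketch below uses nominal GEO and LEO altitudes and ignores the additional round trips that chatty application protocols impose, which is what the protocol-optimization recommendation targets.

```python
# Back-of-the-envelope propagation latency for a request/response pair over
# satellite links; altitudes are nominal, not specific to any Coast Guard system.
C = 299_792.458          # speed of light, km/s

def round_trip_ms(altitude_km):
    """One request plus one response traverses the up/down path twice."""
    return 4 * altitude_km / C * 1000

print(f"GEO: {round_trip_ms(35_786):.0f} ms")   # ~480 ms before any protocol overhead
print(f"LEO: {round_trip_ms(780):.0f} ms")      # ~10 ms for an Iridium-class orbit
```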
DOE Office of Scientific and Technical Information (OSTI.GOV)
Aponte, C.I.
F and H Tank Farms generate supernate- and sludge-contaminated low-level waste. The waste is collected, characterized, and packaged for disposal. Before the waste can be disposed of, however, it must be properly characterized. Since the radionuclide distribution in typical supernate is well known, its characterization is relatively straightforward and requires minimal effort. Non-routine waste, including potentially sludge-contaminated waste, requires much more effort to characterize effectively: the radionuclide distribution must be determined, and in some cases the waste may have been contaminated by various sludge transfers with unique radionuclide distributions. In these cases, characterization can require an extensive effort. Even after an extensive characterization effort, the container must still be prepared for shipping. Therefore, a significant amount of time may elapse from the time the waste is generated until the time of disposal, during which a tornado or high-wind scenario could occur. The purpose of this report is to determine the effect of a tornado on potentially sludge-contaminated waste or Transuranic (TRU) waste in B-25s [large storage containers], to evaluate the potential impact on F and H Tank Farms, and to help establish a B-25 control program for tornado events.
Real-time model learning using Incremental Sparse Spectrum Gaussian Process Regression.
Gijsberts, Arjan; Metta, Giorgio
2013-05-01
Novel applications in unstructured and non-stationary human environments require robots that learn from experience and adapt autonomously to changing conditions. Predictive models therefore not only need to be accurate, but should also be updated incrementally in real-time and require minimal human intervention. Incremental Sparse Spectrum Gaussian Process Regression is an algorithm that is targeted specifically for use in this context. Rather than developing a novel algorithm from the ground up, the method is based on the thoroughly studied Gaussian Process Regression algorithm, therefore ensuring a solid theoretical foundation. Non-linearity and a bounded update complexity are achieved simultaneously by means of a finite dimensional random feature mapping that approximates a kernel function. As a result, the computational cost for each update remains constant over time. Finally, algorithmic simplicity and support for automated hyperparameter optimization ensures convenience when employed in practice. Empirical validation on a number of synthetic and real-life learning problems confirms that the performance of Incremental Sparse Spectrum Gaussian Process Regression is superior with respect to the popular Locally Weighted Projection Regression, while computational requirements are found to be significantly lower. The method is therefore particularly suited for learning with real-time constraints or when computational resources are limited. Copyright © 2012 Elsevier Ltd. All rights reserved.
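A simplified stand-in for this idea, pairing a sparse-spectrum (random Fourier) feature map that approximates an RBF kernel with a constant-cost recursive least-squares update, is sketched below; it is not the authors' implementation, and the hyperparameters and test function are arbitrary.

```python
import numpy as np

# Sparse-spectrum features + recursive least squares: every update costs O(D^2),
# independent of how many samples have been seen, as required for real-time use.
rng = np.random.default_rng(0)
d, D, lengthscale, noise = 1, 100, 0.5, 0.1

W = rng.normal(0.0, 1.0 / lengthscale, size=(D, d))   # spectral frequencies
b = rng.uniform(0.0, 2 * np.pi, size=D)

def features(x):
    return np.sqrt(2.0 / D) * np.cos(W @ x + b)        # fixed finite feature map

P = np.eye(D) * 10.0      # prior weight covariance
w = np.zeros(D)           # weight estimate

def update(x, y):
    global P, w
    phi = features(x)
    k = P @ phi / (noise**2 + phi @ P @ phi)           # gain vector
    w = w + k * (y - phi @ w)
    P = P - np.outer(k, phi @ P)

def predict(x):
    return features(x) @ w

# Stream one sample at a time, as in real-time model learning.
for x in rng.uniform(-3, 3, size=500):
    update(np.array([x]), np.sin(x) + rng.normal(0, noise))
print(predict(np.array([1.0])), np.sin(1.0))
```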
Schomberg, Dominic; Wang, Anyi; Marshall, Hope; Miranpuri, Gurwattan; Sillay, Karl
2013-04-01
Convection enhanced delivery (CED) is a technique using infusion convection currents to deliver therapeutic agents into targeted regions of the brain. Recently, CED has been gaining significant acceptance for use in gene therapy of Parkinson's disease (PD) employing direct infusion into the brain. CED offers advantages in that it targets local areas of the brain, bypasses the blood-brain barrier (BBB), minimizes systemic toxicity of the therapeutics, and allows for delivery of larger molecules that diffusion-driven methods cannot achieve. Investigating infusion characteristics such as backflow and morphology is important in developing standard and effective protocols in order to successfully deliver treatments into the brain. Optimizing clinical infusion protocols may reduce backflow, improve final infusion cloud morphology, and maximize infusate penetrance into targeted tissue. The purpose of the current study was to compare metrics during ramped-rate and continuous-rate infusions using two different catheters in order to optimize current infusion protocols. Occasionally, the infusate refluxes proximally up the catheter tip, known as backflow, and minimizing this can potentially reduce undesirable effects in the clinical setting. Traditionally, infusions are performed at a constant rate throughout the entire duration, and backflow is minimized only by slow infusion rates, which increases the time required to deliver the desired amount of infusate. In this study, we investigate the effects of ramping and various infusion rates on backflow and infusion cloud morphology. The independent parameters in the study are: ramping, maximum infusion rate, time between rate changes, and increments of rate changes. Backflow was measured using two methods: i) at the point of pressure stabilization within the catheter, and ii) maximum backflow as shown by video data. Infusion cloud morphology was evaluated based on the height-to-width ratio of each infusion cloud at the end of each experiment. Results were tabulated and statistically analyzed to identify any significant differences between protocols. The experimental results show that CED ramped-rate infusion protocols result in smaller backflow distances and more spherical cloud morphologies compared to continuous-rate infusion protocols ending at the same maximum infusion rate. Our results also suggest that internal-line pressure measurements can approximate the time-point at which backflow ceases. Our findings indicate that ramping CED infusion protocols can potentially minimize backflow and produce more spherical infusion clouds. However, further research is required to determine the strength of this correlation, especially in relation to maximum infusion rates.
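For illustration only, a ramped-rate schedule of the kind described (rate increments, hold time between rate changes, and maximum rate as the independent parameters) can be compared with a continuous-rate protocol delivering the same volume; all rates and volumes below are hypothetical.

```python
# Hypothetical ramped-rate CED schedule versus a continuous-rate infusion of the
# same total volume; not the study's actual protocols or parameter values.
def ramped_schedule(rates_ul_min, step_min, target_ul):
    """Hold each rate for `step_min` minutes, then stay at the last rate until
    the target volume has been delivered."""
    delivered, t, schedule = 0.0, 0.0, []
    for r in rates_ul_min:
        dv = min(r * step_min, target_ul - delivered)
        schedule.append((t, r))
        delivered += dv
        t += dv / r
        if delivered >= target_ul:
            return schedule, t
    t += (target_ul - delivered) / rates_ul_min[-1]   # finish at the maximum rate
    return schedule, t

sched, total = ramped_schedule([0.5, 1.0, 2.0, 5.0], step_min=5.0, target_ul=60.0)
print(sched, f"total time {total:.1f} min")
print(f"continuous at 5 uL/min: {60.0 / 5.0:.1f} min")
```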
Evolution of the INMARSAT aeronautical system: Service, system, and business considerations
NASA Technical Reports Server (NTRS)
Sengupta, Jay R.
1995-01-01
A market-driven approach was adopted to develop enhancements to the Inmarsat Aeronautical system to address the requirements of potential new market segments. An evolutionary approach and a well-differentiated product/service portfolio were required, to minimize system upgrade costs and to maximize market penetration, respectively. The evolved system definition serves to minimize equipment cost/size/mass for short/medium-range aircraft by reducing the antenna gain requirement and relaxing the performance requirements for non-safety-related communications. A validation program involving simulation, laboratory tests, over-satellite tests, and flight trials is being conducted to confirm the system definition. Extensive market research has been conducted to determine user requirements and to quantify market demand for the future Inmarsat Aero-1 AES, using sophisticated computer-assisted survey techniques.
Turbulence flight director analysis and preliminary simulation
NASA Technical Reports Server (NTRS)
Johnson, D. E.; Klein, R. E.
1974-01-01
A control column and throttle flight director display system is synthesized for use during flight through severe turbulence. The column system is designed to minimize airspeed excursions without overdriving attitude. The throttle system is designed to augment the airspeed regulation and provide an indication of the trim thrust required for any desired flight path angle. Together they form an energy management system to provide harmonious display indications of current aircraft motions and required corrective action, minimize gust upset tendencies, minimize unsafe aircraft excursions, and maintain satisfactory ride qualities. A preliminary fixed-base piloted simulation verified the analysis and provided a shakedown for a more sophisticated moving-base simulation to be accomplished next. This preliminary simulation utilized a flight scenario concept combining piloting tasks, random turbulence, and discrete gusts to create a high but realistic pilot workload conducive to pilot error and potential upset. The turbulence director (energy management) system significantly reduced pilot workload and minimized unsafe aircraft excursions.
Streby, Henry M.; McAllister, Tara L.; Peterson, Sean M.; Kramer, Gunnar R.; Lehman, Justin A.; Andersen, David E.
2015-01-01
Radio-transmitters and light-level geolocators are currently small enough for use on songbirds weighing <15 g. Various methods are used to attach these markers to larger songbirds, but with small birds it becomes especially important to minimize marker mass and bird handling time. Here, we offer modifications to harness materials and marker preparation for transmitters and geolocators, and we describe deployment methods that can be safely completed in 20–60 s per bird. We describe a 0.5-mm elastic sewing thread harness for radio-transmitters that allows nestlings, fledglings, and adults to be marked with the same harness size and reliably falls off to avoid poststudy effects. We also describe a 0.5-mm jewelry cord harness for geolocators that provides a firm fit for >1 yr. Neither harness type requires plastic or metal tubes, rings, or other attachment fixtures on the marker, nor do they require crimping beads, epoxy, scissors, or tying knots while handling birds. Both harnesses add 0.03 g to the mass of markers for small wood-warblers (Parulidae). This minimal additional mass is offset by trimming transmitter antennas or geolocator connection nodes, resulting in no net mass gain for transmitters and 0.02 g added for geolocators compared with conventional harness methods that add >0.40 g. We and others have used this transmitter attachment method with several small songbird species, with no effects on adult and fledgling behavior and survival. We have used this geolocator attachment method on 9-g wood-warblers with no effects on return rates, return dates, territory fidelity, and body mass. We hope that these improvements to the design and deployment of the leg-loop harness method will enable the safe and successful use of these markers, and eventually GPS and other tags, on similarly small songbirds.