The bullwhip effect is the amplification of demand variance as orders move upstream in a supply chain, leading to a distortion of demand information that hinders sustainable supply chain performance. Extensive research has modeled, measured, and analyzed the bullwhip effect under the assumptions of stationary, independent and identically distributed (i.i.d.) demand, the classical order-up-to (OUT) replenishment policy, and permitted return orders. In contrast, correlated demand, where a period's demand depends on previous periods' demands, is evident in several real-life situations, such as demand patterns that exhibit trends or seasonality. This paper assumes correlated demand and investigates the order variance ratio (OVR), net stock amplification ratio (NSA), and average fill rate/service level (AFR). Moreover, the impact of correlated demand on supply chain performance is examined under various operational parameters, such as lead time, the forecasting parameter, ...
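As a rough illustration of how such ratios can be estimated, the sketch below simulates autocorrelated AR(1) demand under an order-up-to policy with exponential-smoothing forecasts and measures OVR and NSA empirically. The parameter values and the choice of exponential smoothing are my illustrative assumptions, not the paper's setup.

```python
import random
import statistics
from collections import deque

def bullwhip_ratios(n=20000, rho=0.7, mu=100.0, sigma=10.0,
                    alpha=0.3, lead=2, seed=1):
    """Estimate the order variance ratio (OVR) and net stock amplification
    ratio (NSA) for AR(1) demand under an order-up-to (OUT) policy with
    exponential-smoothing forecasts. Illustrative parameters only."""
    rng = random.Random(seed)
    d_prev, forecast = mu, mu
    S_prev = (lead + 1) * mu          # previous order-up-to level
    pipeline = deque([mu] * lead)     # orders in transit
    net_stock = 0.0
    demands, orders, stocks = [], [], []
    for _ in range(n):
        d = mu + rho * (d_prev - mu) + rng.gauss(0.0, sigma)  # AR(1) demand
        forecast = alpha * d + (1 - alpha) * forecast         # smoothing
        S = (lead + 1) * forecast                             # OUT level
        q = d + (S - S_prev)                                  # order placed
        pipeline.append(q)
        net_stock += pipeline.popleft() - d                   # receive, ship
        demands.append(d); orders.append(q); stocks.append(net_stock)
        d_prev, S_prev = d, S
    var_d = statistics.variance(demands)
    return (statistics.variance(orders) / var_d,   # OVR
            statistics.variance(stocks) / var_d)   # NSA
```

With these settings the simulated OVR exceeds one, i.e. order variance is amplified relative to demand variance; note that orders may be negative, which mirrors the allow-returns assumption mentioned above.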
In this paper a numerical investigation is carried out on the queueing system G/M/s/N with two classes of customers. Arriving class 1 customers are buffered subject to capacity limitations, while arriving class 2 customers are denied access to the system when it is filled to capacity ...
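The truncated abstract leaves the exact admission rules open; as one plausible reading, the sketch below simulates a two-class system where class 1 may wait in the buffer (rejected only when the system holds N customers) while class 2 is admitted only when a server is idle. It also replaces the general (G) arrival process with Poisson arrivals for simplicity, so it is a hypothetical M/M/s/N stand-in, not the paper's model.

```python
import random

def blocking_estimates(lam1=1.0, lam2=1.0, mu=1.5, s=2, N=5,
                       steps=200000, seed=7):
    """Jump-chain simulation of a two-class M/M/s/N system: class 1 waits
    in the buffer (blocked only when the system holds N), class 2 is
    admitted only when a server is idle. A hypothetical reading of the
    truncated abstract; the paper treats general (G) arrivals."""
    rng = random.Random(seed)
    n = 0                                 # customers in system
    arrivals = [0, 0]
    blocked = [0, 0]
    for _ in range(steps):
        service = mu * min(n, s)          # total service-completion rate
        total = lam1 + lam2 + service
        u = rng.random() * total
        if u < lam1:                      # class 1 arrival
            arrivals[0] += 1
            if n < N:
                n += 1
            else:
                blocked[0] += 1
        elif u < lam1 + lam2:             # class 2 arrival
            arrivals[1] += 1
            if n < s:                     # needs a free server
                n += 1
            else:
                blocked[1] += 1
        else:                             # service completion
            n -= 1
    return blocked[0] / arrivals[0], blocked[1] / arrivals[1]
```

Because class 2 faces the stricter admission rule, its estimated blocking probability comes out higher than class 1's.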
Journal of the American Statistical Association, 1980
Acceptance/Rejection Methods for Beta Variate Generation. Bruce W. Schmeiser and Mohamed A. Shalaby. The computer generation of pseudorandom variates from the beta distribution is discussed. Three ...
A generalized measure of performance is defined as a weighted combination of the ergodic queue length distribution, where the weights are general functions of the system parameters. The paper presents a sequence of upper and lower bounds for this measure of performance in ...
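To make the "weighted combination of the ergodic queue length distribution" concrete, the snippet below evaluates such a measure for the ergodic distribution of an M/M/1/N queue, where pi_n is proportional to rho**n. The M/M/1/N instance and the weight functions are my illustrative choices, not necessarily the system studied in the paper.

```python
def weighted_measure(rho, N, weight):
    """Sum of weight(n) * pi_n over the ergodic distribution pi of an
    M/M/1/N queue (pi_n proportional to rho**n). Illustrative instance of
    a weighted combination of the ergodic queue length distribution."""
    raw = [rho ** n for n in range(N + 1)]
    Z = sum(raw)                       # normalizing constant
    return sum(weight(n) * p / Z for n, p in enumerate(raw))

# Weight w(n) = n recovers the mean number in system; for rho = 0.5 and a
# large buffer this approaches the infinite-buffer value rho/(1 - rho) = 1.
mean_in_system = weighted_measure(0.5, 50, lambda n: n)
```

Other weight choices recover familiar performance measures, e.g. an indicator weight on n = N gives the blocking probability.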
The computer generation of pseudo-random variates from the beta distribution is discussed. Presented are three exact methods applicable for parameter values p > 1 and q > 1. All three methods use rejection from regions defined by the location of the points of inflexion. The methods ...
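The paper's three methods build rejection regions from the density's inflection points, which cannot be reconstructed from this excerpt. As a generic stand-in for the same idea, the sketch below draws Beta(p, q) variates for p, q > 1 by plain rejection against a constant envelope at the density's mode.

```python
import random

def beta_rejection(p, q, rng):
    """Draw one Beta(p, q) variate, p > 1 and q > 1, by simple rejection
    from a uniform box over [0, 1] x [0, f(mode)]. A generic illustration,
    not the inflection-point regions of the paper's three methods."""
    def g(x):                        # unnormalized beta density
        return x ** (p - 1) * (1 - x) ** (q - 1)
    mode = (p - 1) / (p + q - 2)     # density peak, valid for p, q > 1
    peak = g(mode)
    while True:
        x = rng.random()             # candidate point
        if rng.random() * peak <= g(x):
            return x                 # accept: point fell under the density

rng = random.Random(42)
sample = [beta_rejection(2, 3, rng) for _ in range(20000)]
```

Tighter envelopes (such as regions built from the inflection points, as in the paper) reject fewer candidates and are correspondingly faster.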
... The Big-M used is the minimum of the number of sources and destinations, multiplied by the difference between the largest and smallest unit cost, plus the smallest unit cost. ... A set of optimal bases (one for each commodity) is always saved, and the solution of Step 3 is begun with ...
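The excerpt's Big-M rule translates directly into code; the function and parameter names below are mine.

```python
def big_m(unit_costs, n_sources, n_destinations):
    """Big-M as stated in the excerpt: min(#sources, #destinations) times
    the spread between the largest and smallest unit cost, plus the
    smallest unit cost."""
    lo, hi = min(unit_costs), max(unit_costs)
    return min(n_sources, n_destinations) * (hi - lo) + lo
```

For example, with unit costs {2, 5, 9}, 3 sources, and 4 destinations, the rule gives 3 * (9 - 2) + 2 = 23.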
International Journal of Operational Research, 2012
The two-dimensional irregular strip packing problem is a common cutting and packing problem in which a set of 2D irregular-shaped items must be assigned (cut or packed) to a rectangular sheet. The sheet width is fixed, while its length is extendable and has to be minimised. In this paper, a new mixed-integer programming (MIP) model is introduced to optimally solve a special case of the problem in which item shapes are polygons with orthogonal edges, named polyominoes. Polyomino strip packing may be classified as polyomino tiling, a problem that can also be handled by the proposed model. Reasonable problem sizes (e.g. 45 polyominoes inside a 10 × 25 sheet) are solvable using an ordinary PC, and larger problem sizes are expected to be solvable using state-of-the-art computational facilities. The model is also verified on a set of benchmark problems collected from the literature and provides optimal solutions for all cases.
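The paper solves this special case with a MIP model; as a self-contained illustration of the same problem statement (fixed-width strip, orthogonal polyomino pieces, minimised length), the sketch below finds the minimum strip length by exhaustive search. It is a toy stand-in for tiny instances, not the paper's model.

```python
from itertools import count

def orientations(cells):
    """All distinct 90-degree rotations of a polyomino, normalized so the
    minimum row and column are zero. Cells are (row, col) pairs."""
    out = set()
    cur = list(cells)
    for _ in range(4):
        cur = [(c, -r) for r, c in cur]                  # rotate 90 degrees
        mr = min(r for r, _ in cur)
        mc = min(c for _, c in cur)
        out.add(tuple(sorted((r - mr, c - mc) for r, c in cur)))
    return [list(o) for o in out]

def packs(width, length, pieces):
    """Can all pieces be placed without overlap in a width x length strip?"""
    grid = [[False] * width for _ in range(length)]
    orients = [orientations(p) for p in pieces]
    used = [False] * len(pieces)

    def first_empty():
        for r in range(length):
            for c in range(width):
                if not grid[r][c]:
                    return r, c
        return None

    def dfs():
        if all(used):
            return True
        spot = first_empty()
        if spot is None:
            return False                # cells exhausted, pieces remain
        r0, c0 = spot
        for i, ors in enumerate(orients):
            if used[i]:
                continue
            for cells in ors:
                ar, ac = cells[0]       # anchor: first cell in row-major order
                placed = [(r - ar + r0, c - ac + c0) for r, c in cells]
                if all(0 <= r < length and 0 <= c < width and not grid[r][c]
                       for r, c in placed):
                    used[i] = True
                    for r, c in placed:
                        grid[r][c] = True
                    if dfs():
                        return True
                    used[i] = False
                    for r, c in placed:
                        grid[r][c] = False
        grid[r0][c0] = True             # leave this cell as a hole
        if dfs():
            return True
        grid[r0][c0] = False
        return False

    return dfs()

def min_strip_length(width, pieces):
    """Smallest strip length that fits all pieces (brute force)."""
    area = sum(len(p) for p in pieces)
    for length in count((area + width - 1) // width):  # start at area bound
        if packs(width, length, pieces):
            return length
```

For example, two L-trominoes and a domino in a width-2 strip pack into length 4, which matches the area lower bound, so 4 is optimal.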
This paper presents a heuristic technique for obtaining good solutions to large multicommodity network flow problems. The general approach is to allocate the arc capacities among the individual commodities and hence decompose the problem into a set of one-commodity problems. The one-commodity problems are solved and the combined solution is compared to a lower bound. If the solution is within ...
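The decomposition idea can be sketched on a toy network: split each arc's capacity among the commodities, solve each resulting one-commodity problem, and compare the combined cost to a lower bound. The even capacity split, shortest-path routing, and the uncapacitated lower bound below are my simplifications for illustration, not the paper's allocation or bounding scheme.

```python
def shortest_cost(nodes, arcs, src, dst):
    """Bellman-Ford shortest-path cost; arcs maps (u, v) -> unit cost."""
    INF = float("inf")
    dist = {v: INF for v in nodes}
    dist[src] = 0.0
    for _ in range(len(nodes) - 1):
        for (u, v), c in arcs.items():
            if dist[u] + c < dist[v]:
                dist[v] = dist[u] + c
    return dist[dst]

def decomposition_heuristic(nodes, arcs, capacity, commodities):
    """Toy decomposition: split every arc's capacity evenly among the
    commodities, route each commodity along its cheapest path whose
    allocated arcs carry its whole demand, and report the total cost next
    to an uncapacitated shortest-path lower bound."""
    k = len(commodities)
    total = 0.0
    for (src, dst, demand) in commodities:
        usable = {a: c for a, c in arcs.items()
                  if capacity[a] / k >= demand}     # even allocation
        total += demand * shortest_cost(nodes, usable, src, dst)
    bound = sum(demand * shortest_cost(nodes, arcs, src, dst)
                for (src, dst, demand) in commodities)
    return total, bound

# Hypothetical 3-node example: the direct arc s -> t is cheap but, once its
# capacity is split evenly, too small for either commodity's demand.
nodes = ["s", "m", "t"]
arcs = {("s", "t"): 1.0, ("s", "m"): 1.0, ("m", "t"): 1.0}
capacity = {("s", "t"): 4.0, ("s", "m"): 10.0, ("m", "t"): 10.0}
commodities = [("s", "t", 3.0), ("s", "t", 3.0)]
```

Here the even split forces both commodities onto the two-arc detour (total cost 12) while the lower bound is 6, showing the gap a smarter allocation of the direct arc's capacity would close.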
In a tandem automated guided vehicle (AGV) system, the shop floor is partitioned into a group of non-overlapping zones, each of which is served by a single dedicated AGV and may have one or more transfer points that link it to other zones. In this paper, a two-phase partitioning algorithm for designing tandem AGV systems is proposed. The algorithm serves three objectives: minimizing the material handling cost, minimizing the maximum workload, and minimizing the number of trips made between zones. Workload and cost analyses are based on the Shortest-Time-to-Travel-First (STTF) dispatching policy. Performance evaluation of the algorithm is carried out by validating its estimates using simulation and by comparing its performance to that of other algorithms.
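The STTF rule can be sketched in a few lines: among the pending transport requests in a zone, the AGV serves the one it can reach soonest. Manhattan distances, unit speed, and point coordinates are my assumptions for illustration; the paper's workload and cost analyses are built on top of this dispatching rule, not shown here.

```python
def sttf_next(agv_pos, requests):
    """Shortest-Time-to-Travel-First: pick the pending pickup the zone's
    AGV can reach soonest (Manhattan distance, unit speed assumed)."""
    return min(requests,
               key=lambda p: abs(p[0] - agv_pos[0]) + abs(p[1] - agv_pos[1]))

def sttf_tour(agv_pos, requests):
    """Repeatedly apply the STTF rule until all pending requests are served,
    returning the service order."""
    pending, order, pos = list(requests), [], agv_pos
    while pending:
        nxt = sttf_next(pos, pending)
        pending.remove(nxt)
        order.append(nxt)
        pos = nxt                      # AGV now sits at the served point
    return order
```

Note that STTF is greedy: it minimizes each next empty-travel leg, not the total tour length.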
Papers by Mohamed A. Shalaby