In this paper, we study a queue fed by a large number $n$ of independent discrete-time Gaussian processes with stationary increments. We consider the {\it many sources} asymptotic regime, i.e., the buffer exceedance threshold $B$ and the service capacity $C$ are scaled by the number of sources ($B \equiv nb$ and $C \equiv nc$). We discuss three methods for simulating the steady-state probability that the buffer threshold is exceeded: the single twist method (suggested by large deviation theory), the cut-and-twist method (simulating timeslot by timeslot), and the sequential twist method (simulating source by source). The asymptotic efficiency of these three methods is investigated as $n \to \infty$: for instance, a necessary and sufficient condition is derived for the efficiency of the method based on a single exponential twist. It turns out that this method is asymptotically inefficient in practice, but the other two methods are asymptotically efficient. We evaluate the three methods by performing a simulation study.
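To illustrate the core idea behind exponential twisting, the following is a minimal sketch (not the paper's method) of an importance-sampling estimator for an overflow probability of the form $P(S_n > na)$, where $S_n$ is a sum of $n$ i.i.d. standard Gaussian increments; the twist parameter $\theta = a$, the cumulant function $\psi(\theta) = \theta^2/2$, and all function and variable names are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

def overflow_prob_single_twist(n, a, num_runs=100_000):
    """Estimate P(S_n > n*a) for S_n a sum of n i.i.d. N(0,1) increments,
    using a single exponential twist (illustrative sketch only).

    For a standard Gaussian, the cumulant generating function is
    psi(theta) = theta^2 / 2, and the large-deviations-optimal twist
    solves psi'(theta) = a, i.e. theta = a."""
    theta = a
    psi = theta**2 / 2.0
    # Sample under the twisted measure: increments become N(theta, 1).
    samples = rng.normal(loc=theta, scale=1.0, size=(num_runs, n))
    s = samples.sum(axis=1)
    # Likelihood ratio dP/dQ evaluated on each path:
    # exp(-theta * S_n + n * psi(theta)).
    lr = np.exp(-theta * s + n * psi)
    # Unbiased estimator: indicator of overflow times likelihood ratio.
    return np.mean((s > n * a) * lr)

est = overflow_prob_single_twist(n=50, a=0.5)
```

For this toy target the exact value is $P(N(0,n) > na) = \bar{\Phi}(a\sqrt{n}) \approx 2.0 \times 10^{-4}$, which crude Monte Carlo with the same budget would estimate with far higher relative error; the paper's point is that for *dependent* Gaussian input such a single twist can nonetheless fail to be asymptotically efficient.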

CWI. Probability, Networks and Algorithms [PNA]

Dieker, T., & Mandjes, M. (2004). Fast simulation of overflow probabilities in a queue with Gaussian input. CWI, Probability, Networks and Algorithms [PNA].