Importance sampling in rate-sharing networks
We consider a network supporting elastic traffic, where the service capacity is shared among the various classes according to an alpha-fair sharing policy. Assuming Poisson arrivals and exponentially distributed service requirements for each class, the dynamics of the user population can be described by a Markov process. We focus on the probability that, given the network is in some state n0 at time 0, the network is in some set of states A (not containing n0) at time T. In particular, we assume that the underlying event is rare, i.e., the probability of interest is small. As in general no explicit expressions are known for this probability, an attractive approach is to resort to Monte Carlo (MC) simulation. However, due to the rarity of the event under consideration, naive MC simulation is infeasible. A natural approach to speeding up the simulation is Importance Sampling (IS). We present an IS algorithm, based on large-deviations results, to accelerate the simulation. Extensive simulation experiments assess the performance of the algorithm; under rather general conditions a considerable speed-up is achieved.
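To make the general idea concrete, the sketch below illustrates IS for the simplest instance of such a rate-sharing network: a single class on one link, where the user-population process reduces to an M/M/1-type birth-death chain with arrival rate lam and service rate mu. It estimates the rare-event probability of the population reaching a high level (a stand-in for the target set A) before time T, using the classical exponential change of measure that interchanges the arrival and service rates; the specific rates, level, and horizon are illustrative choices, not parameters from the paper.

```python
import math
import random

def simulate_path(lam, mu, n0, level, T, q_lam, q_mu):
    """Simulate one CTMC path under the IS rates (q_lam, q_mu); return the
    likelihood-ratio-weighted indicator of hitting `level` before time T."""
    t, n, logw = 0.0, n0, 0.0
    while n < level:
        up = q_lam
        down = q_mu if n > 0 else 0.0     # no departures from an empty system
        rate = up + down
        true_rate = lam + (mu if n > 0 else 0.0)
        dt = random.expovariate(rate)
        if t + dt > T:
            return 0.0                    # horizon reached before the event
        # Per-transition likelihood ratio: the total-rate prefactors of the
        # holding-time densities cancel against the jump probabilities,
        # leaving the exponential term plus the individual rate ratio.
        logw += (rate - true_rate) * dt
        t += dt
        if random.random() < up / rate:
            logw += math.log(lam / q_lam)  # arrival
            n += 1
        else:
            logw += math.log(mu / q_mu)    # departure
            n -= 1
    return math.exp(logw)                  # event occurred: weight = dP/dQ

def is_estimate(lam, mu, n0, level, T, q_lam, q_mu, runs):
    """Average the weighted indicators: an unbiased estimator of the
    probability of hitting `level` before T under the original dynamics."""
    return sum(simulate_path(lam, mu, n0, level, T, q_lam, q_mu)
               for _ in range(runs)) / runs

if __name__ == "__main__":
    random.seed(1)
    # Interchanging lam and mu pushes paths toward the rare event; the
    # likelihood-ratio weights correct the resulting bias.
    est = is_estimate(lam=1.0, mu=2.0, n0=0, level=8, T=10.0,
                      q_lam=2.0, q_mu=1.0, runs=5000)
    print(f"IS estimate: {est:.3e}")
```

Under the swapped rates almost every path reaches the high level, so nonzero samples are plentiful, whereas naive MC would see a hit only once in roughly (mu/lam)^level busy cycles; the multi-class, alpha-fair setting of the paper requires a state-dependent change of measure, which this single-class sketch does not capture.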