Over the last few years, the use of large-scale multimedia data applications has grown tremendously, and this growth is not likely to slow down in the near future. Many multimedia applications operate in a real-time environment (e.g., surveillance cameras, iris scans) and must meet strict time constraints, i.e., analyze video frames at the same rate as the camera produces them. To meet this requirement, Grid computing is rapidly becoming indispensable. However, the variability of the software and hardware in a grid environment causes strong burstiness in the transmission delay of video frames. Because this burstiness is not known beforehand, it is difficult to determine the right moments at which to send video frames. If the time interval between sending two successive frames is too large, service utilization may be low. If a large buffer is used to guarantee service utilization, video frames may become out of date because of long waiting times in the buffer on the server side. This is referred to as the “Just-in-Time” problem. To solve it, it is essential to determine the right sending moments of video frames, properly dealing with the trade-off between service utilization and the freshness of video frames. Motivated by this, in this paper we develop an adaptive control method that reacts to the continuously changing circumstances in a grid system, so as to obtain the highest possible service utilization on the one hand while keeping the video frames up to date on the other. Extensive experimental validation on our DAS-3 testbed and trace-driven simulations show that our method is indeed highly effective.
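To make the trade-off concrete, the sketch below shows one simple way such an adaptive sender could be structured: a feedback rule that lengthens the inter-frame sending interval when frames wait too long in the server-side buffer (risking staleness) and shortens it when the buffer runs dry (risking low utilization). This is an illustrative, hypothetical example only, not the control method proposed in the paper; the class name, the target waiting time, and the feedback gain are assumptions made for the illustration.

```python
class AdaptiveFrameSender:
    """Hypothetical sketch of a feedback controller for the "Just-in-Time"
    trade-off: it adapts the inter-frame sending interval to the observed
    waiting time of frames in the server-side buffer."""

    def __init__(self, target_wait=0.02, gain=0.5, initial_interval=0.04):
        self.target_wait = target_wait    # acceptable buffer waiting time (s); assumed value
        self.gain = gain                  # feedback gain; assumed value
        self.interval = initial_interval  # current inter-frame sending interval (s)

    def update(self, observed_wait):
        """Adjust the sending interval after each frame.

        If a frame waited longer than target_wait in the server buffer, it
        risks being out of date, so the interval is increased; if it waited
        less, the server may sit idle, so the interval is decreased.
        """
        error = observed_wait - self.target_wait
        self.interval = max(0.001, self.interval + self.gain * error)
        return self.interval


# Example usage with a few observed server-side waiting times (seconds).
sender = AdaptiveFrameSender()
for wait in [0.05, 0.04, 0.01, 0.005]:
    print(f"observed wait {wait:.3f}s -> next interval {sender.update(wait):.3f}s")
```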
IEEE/ACM International Symposium on Cluster Computing and the Grid
Stochastics

van der Mei, R., Seinstra, F. J., Koole, G., Yang, R., Roubos, D., & Bal, H. (2008). Performance model for “Just-in-Time” problems in real-time multimedia applications.