Relativized obliviousness is introduced to capture the intuitive idea that some problems admit fastest computations which are more oblivious than those of other problems, without any such computation being oblivious in the standard sense. It is shown that each increase in the obliviousness of an algorithm (in several distinct, well-defined senses) may, for some problems, necessarily require an increase in computation time from T(n) steps to T(n) log T(n) steps. There is, however, no problem for which a totally oblivious algorithm requires more than order T(n) log T(n) steps if the best algorithm for it runs in T(n) steps. We use on-line Turing machines as the model of computation.
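To illustrate the notion of obliviousness the abstract relies on (not a construction from the paper itself): a computation is oblivious when its sequence of storage accesses depends only on the input length n, not on the input values. The sketch below, a hypothetical illustration, contrasts an oblivious sorting procedure (a fixed compare-exchange schedule) with a non-oblivious one (insertion sort, whose inner loop exits early depending on the data), recording the access trace of each.

```python
def oblivious_sort(a, trace):
    # Odd-even transposition sort: the compare-exchange schedule is fixed
    # by n alone, so the access trace is identical for every input of
    # length n -- this is the standard sense of "oblivious".
    a = list(a)
    n = len(a)
    for rnd in range(n):
        for i in range(rnd % 2, n - 1, 2):
            trace.append((i, i + 1))   # access is data-independent
            if a[i] > a[i + 1]:
                a[i], a[i + 1] = a[i + 1], a[i]
    return a

def nonoblivious_sort(a, trace):
    # Insertion sort: the inner loop terminates early depending on the
    # data, so inputs of the same length produce different access traces.
    a = list(a)
    for j in range(1, len(a)):
        i = j
        while i > 0 and a[i - 1] > a[i]:
            trace.append((i - 1, i))   # access depends on the data
            a[i - 1], a[i] = a[i], a[i - 1]
            i -= 1
    return a

t1, t2, t3, t4 = [], [], [], []
oblivious_sort([3, 1, 2, 0], t1)
oblivious_sort([0, 1, 2, 3], t2)
nonoblivious_sort([3, 1, 2, 0], t3)
nonoblivious_sort([0, 1, 2, 3], t4)
print(t1 == t2)  # True: same trace for all inputs of length 4
print(t3 == t4)  # False: trace varies with the input values
```

Relativized obliviousness, as introduced in the paper, sits between these extremes: it compares how oblivious the fastest algorithms for different problems can be, rather than demanding a fully data-independent schedule.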

Additional Metadata
Persistent URL dx.doi.org/10.1007/BFb0022541
Conference International Symposium on Mathematical Foundations of Computer Science
Citation
Vitányi, P.M.B. (1980). Relativized obliviousness. In Lecture Notes in Computer Science/Lecture Notes in Artificial Intelligence (pp. 665–672). doi:10.1007/BFb0022541