graycat | 4 months ago
Apparently part of the algorithm is based on the size of the storage being requested.
Hmm. So, we have historical data of storage requests and, for each request, (i) the size of the request, (ii) how long until the storage is freed, (iii) etc. ....
Guessing about a bizarre case: it might be that on Monday many storage requests of certain small sizes have lifetimes just a little longer than the point at which the allocator decides to move the request to another category, i.e., the moving effort was inefficient, wasted.
So, in simple terms, for an optimization: for each of the variables we have both in the history and in real time, make the variable values discrete. Altogether, for some positive integer n, we might end up with a few thousand different n-tuples of variable values; then, for each n-tuple, pick the best decision (policy, etc.). Uh, unless this idea has already been tried.
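For what it's worth, a minimal sketch of that idea in Python. Everything here is made up for illustration: the variables (request size, lifetime), the bucket edges, the candidate policies ("keep" vs. "move"), and the per-record cost are all hypothetical stand-ins, not anyone's actual allocator data. The point is just the mechanism: discretize each variable into buckets, treat the bucket tuple as a table key, and for each tuple keep whichever policy had the lowest average historical cost.

```python
from collections import defaultdict

def discretize(value, edges):
    """Map a continuous value to a bucket index given sorted bin edges."""
    for i, edge in enumerate(edges):
        if value < edge:
            return i
    return len(edges)

# Hypothetical history records: (size_bytes, lifetime_seconds, policy_tried, cost).
history = [
    (4096, 0.5, "keep", 1.0),
    (4096, 0.6, "move", 3.0),
    (1 << 20, 120.0, "move", 0.5),
    (1 << 20, 110.0, "keep", 2.0),
]

size_edges = [8192, 1 << 16, 1 << 20]  # hypothetical bucket boundaries for size
life_edges = [1.0, 10.0, 60.0]         # hypothetical boundaries for lifetime

# Accumulate (total cost, count) per (n-tuple, policy) pair from the history.
stats = defaultdict(lambda: [0.0, 0])
for size, lifetime, policy, cost in history:
    key = (discretize(size, size_edges), discretize(lifetime, life_edges))
    stats[(key, policy)][0] += cost
    stats[(key, policy)][1] += 1

# For each n-tuple of bucket indices, pick the policy with the lowest
# average historical cost.
best = {}
for (key, policy), (total, count) in stats.items():
    avg = total / count
    if key not in best or avg < best[key][1]:
        best[key] = (policy, avg)

def decide(size, lifetime, default="keep"):
    """Real-time decision: discretize the live request, look up the table."""
    key = (discretize(size, size_edges), discretize(lifetime, life_edges))
    return best.get(key, (default,))[0]
```

With a few thousand distinct tuples this stays a small in-memory table, so the real-time lookup is just a hash probe; the open question the comment raises (whether someone has already tried this) isn't settled by the sketch.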