The question comes from Introduction to Algorithms, 3rd edition, page 63, Problem 3-6, where it is introduced as "iterated functions". I rewrite it as follows:
int T(int n){
   int count = 0;
   for(; n > 2; ++count)
   {
      n = n/log₂(n);
   }
   return count;
}
Then give as tight a bound as possible on T(n).
I can show O(log n) and Ω(log n / log log n), but can the bound be made tighter?
PS: Using Mathematica, I found that for n = 1×10^3281039, T(n) = 500000,
and at the same time T(n) ≈ 1.072435·log n / log log n,
with the coefficient declining as n grows, from 1.22943 (n = 2.07126×10^235) to 1.072435 (n = 1×10^3281039).
I hope this information is helpful.
                        
It looks like the lower bound is pretty good, so I tried to prove that the upper bound is O(log n / log log n). But let me first explain the other bounds (just for a better understanding).

TL;DR: T(n) is in Θ(log n / log log n).

T(n) is in
O(log n). This can be seen by modifying n := n/log₂(n) to n := n/2: since log₂(n) ≥ 2 for n ≥ 4, the original code shrinks n at least as fast as halving, and halving needs O(log₂ n) steps until n ≤ 2 holds.

T(n) is in
Ω(log n / log log n). This can be seen by modifying n := n/log₂(n) to n := n/m, where m is the initial value of log n. Solving the equation n / (log n)^x < 2 for x leads us to

log n - x·log log n < log 2
⇔ log n - log 2 < x·log log n
⇔ (log n - log 2) / log log n < x
⇒ x ∈ Ω(log n / log log n)

Improving the upper bound:
O(log n) → O(log n / log log n). Now let us try to improve the upper bound. Instead of dividing n by a fixed constant (namely 2 in the proof above), we divide n by the initial value of log(n)/2 for as long as the current value of log(n) is bigger, and then recurse on the remaining value. To make this clearer, have a look at the modified code:
The complexity of the function T₂ is clearly an upper bound for the function T, since log₂(n_old)/2 < log₂(n) holds the whole time.

Now we need to know how many times we divide by each ½·log(n_old):

n / (log √n)^x ≤ √n
⇔ n / √n ≤ (log √n)^x
⇔ log √n ≤ x · log(log √n)
⇔ log √n / log(log √n) ≤ x

So we get the recurrence formula
T₂(n) = T₂(√n) + O(log(√n) / log(log(√n))).

Now we need to know how often this formula has to be expanded until n < 2 holds:

n^(2^-x) < 2
⇔ 2^-x · log n < log 2
⇔ log log n - x·log 2 < log log 2
⇒ log log n < (x + 1)·log 2   (since log log 2 ≤ log 2)

So we need to expand the formula about
log log n times.

Now it gets a little bit harder (have also a look at Mike_Dog's answer):

T₂(n) = T₂(√n) + log(√n) / log(log(√n))
      = Σ_{k=1}^{log log n - 1} 2^-k·log(n) / log(2^-k·log n)
      = log(n) · Σ_{k=1}^{log log n - 1} 2^-k / (log log n - k)    (1)
      = log(n) · Σ_{k=1}^{log log n - 1} 2^(k - log log n) / k
      = log(n) · Σ_{k=1}^{log log n - 1} 2^k · 2^(-log log n) / k
      = log(n) · Σ_{k=1}^{log log n - 1} 2^k / (k · log n)
      = Σ_{k=1}^{log log n - 1} 2^k / k

In the line marked with (1) I reordered the sum by substituting k → log log n - k.
So, at the end we "only" have to calculate Σ_{k=1}^{t} 2^k / k for t = log log n - 1. At this point Maple solves the sum to a closed form involving the imaginary unit I and the Lerch transcendent LerchPhi. Since the result for the sum above is a real number for all relevant cases, we can just ignore all imaginary parts. The Lerch transcendent LerchPhi(2,1,t) seems to be in O(-1/t), but I'm not 100% sure about it; maybe someone will prove this. Finally this results in

Σ_{k=1}^{t} 2^k / k ∈ O(2^t / t) = O(log n / log log n),

since 2^t = 2^(log log n - 1) = log(n)/2 and t = log log n - 1.
All together we have T(n) ∈ Ω(log n / log log n) and T(n) ∈ O(log n / log log n), so T(n) ∈ Θ(log n / log log n) holds. This result is also supported by your example data.

I hope this is understandable and it helps a little.