UMD Analysis Qualifying Exam/Jan11 Real

Solution
Since $$f$$ is (absolutely) continuous on the compact set $$[0,1]$$ with $$f>0$$, it attains a positive minimum: there exists some $$0<m$$ such that $$f(x)\geq m$$ for all $$x\in[0,1]$$. Since $$f$$ is absolutely continuous, for any $$\epsilon>0$$ there exists some $$\delta>0$$ such that for any finite collection of disjoint intervals $$I_k=(x_k,y_k)$$, $$k=1,\dots,n$$, with $$\sum_{k=1}^n|y_k-x_k|<\delta$$, we have $$\sum_{k=1}^n|f(y_k)-f(x_k)|<\epsilon m^2$$.

Then for any such collection of intervals, since $$f(y_k)f(x_k)\geq m^2$$, we have $$\sum_{k=1}^n \left|\frac{1}{f(y_k)}-\frac{1}{f(x_k)}\right| =\sum_{k=1}^n \left|\frac{f(x_k)-f(y_k)}{f(y_k)f(x_k)}\right| \leq \sum_{k=1}^n\frac{1}{m^2}|f(y_k)-f(x_k)|<\epsilon,$$ so $$\frac{1}{f}$$ is absolutely continuous on $$[0,1]$$.
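For completeness, here is a minimal compilable LaTeX write-up of the argument. The proposition statement is reconstructed from the solution, since the original problem statement does not appear on this page; treat its exact wording as an assumption.

```latex
\documentclass{article}
\usepackage{amsmath,amsthm}
\newtheorem{prop}{Proposition}
\begin{document}

% Statement reconstructed from the solution above (assumption: this is the
% original exam problem).
\begin{prop}
If $f$ is absolutely continuous on $[0,1]$ and $f>0$, then $1/f$ is
absolutely continuous on $[0,1]$.
\end{prop}

\begin{proof}
Since $f$ is continuous and positive on the compact set $[0,1]$, it attains
a minimum $m>0$; thus $f(x)\ge m$ for all $x\in[0,1]$. Given $\epsilon>0$,
absolute continuity of $f$ gives $\delta>0$ such that for every finite
collection of disjoint intervals $(x_k,y_k)\subseteq[0,1]$, $k=1,\dots,n$,
\[
\sum_{k=1}^{n}|y_k-x_k|<\delta
\quad\Longrightarrow\quad
\sum_{k=1}^{n}|f(y_k)-f(x_k)|<\epsilon m^{2}.
\]
For any such collection, $f(x_k)f(y_k)\ge m^{2}$, so
\[
\sum_{k=1}^{n}\left|\frac{1}{f(y_k)}-\frac{1}{f(x_k)}\right|
=\sum_{k=1}^{n}\frac{|f(x_k)-f(y_k)|}{f(x_k)f(y_k)}
\le\frac{1}{m^{2}}\sum_{k=1}^{n}|f(y_k)-f(x_k)|
<\epsilon,
\]
and $1/f$ is absolutely continuous on $[0,1]$.
\end{proof}

\end{document}
```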