Figures 3 and 4 illustrate the improvement factors of the 1-level and 2-level caches, respectively, for different cache hit ratios.
These algorithms mostly combine LFU and LRU to get better results in the cache hit ratio.
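One minimal way to blend the two policies is to score each cached entry by a weighted mix of its access frequency (the LFU signal) and its last-use time (the LRU signal), and evict the lowest-scoring entry. The sketch below is our own illustration under that assumption; real hybrids such as LRFU use a decaying combined value instead of this simple weighted sum.

```python
class HybridCache:
    """Toy cache whose eviction blends LFU and LRU.
    weight = 1.0 -> pure frequency (LFU); weight = 0.0 -> pure recency (LRU)."""

    def __init__(self, capacity, weight=0.5):
        self.capacity = capacity
        self.weight = weight
        self.store = {}       # key -> value
        self.freq = {}        # key -> access count
        self.last_used = {}   # key -> logical timestamp
        self.clock = 0

    def _score(self, key):
        # Higher score means more worth keeping.
        return (self.weight * self.freq[key]
                + (1 - self.weight) * self.last_used[key])

    def get(self, key):
        self.clock += 1
        if key in self.store:                 # cache hit
            self.freq[key] += 1
            self.last_used[key] = self.clock
            return self.store[key]
        return None                           # cache miss

    def put(self, key, value):
        self.clock += 1
        if key not in self.store and len(self.store) >= self.capacity:
            victim = min(self.store, key=self._score)
            for table in (self.store, self.freq, self.last_used):
                del table[victim]
        self.store[key] = value
        self.freq[key] = self.freq.get(key, 0) + 1
        self.last_used[key] = self.clock
```

With `weight=0.5`, a frequently reused key survives eviction even when a colder key was touched more recently, which is the behavior a pure LRU would get wrong.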
The cache hit ratio was measured with and without prefetching.
This coordination complicates cooperative caching; not only must clients coordinate to provide traditional caching functions such as block lookup, replacement, and consistency, but they must also coordinate to manage the size of the cooperative cache so that local cache hit rates are not affected.
The prefetches issued during the second iteration result in cache hits for the b and c references, provided that a prefetch distance of one is sufficient.
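The effect of a prefetch distance of one can be shown with a toy trace simulation (our own illustration, not code from the study): each iteration touches one block per reference stream and also prefetches the block that stream will need one iteration later, so the first iteration misses but every later reference hits.

```python
def run_loop(streams, n_iters, prefetch_distance=1):
    """Simulate a loop where iteration i touches block (stream, i) for each
    reference stream, and prefetches block (stream, i + prefetch_distance).
    Returns a list of (iteration, stream, hit) records."""
    cache = set()
    log = []
    for i in range(n_iters):
        for stream in streams:
            block = (stream, i)
            log.append((i, stream, block in cache))      # hit or miss
            cache.add(block)                             # demand fetch
            cache.add((stream, i + prefetch_distance))   # prefetch ahead
    return log
```

In this model the b and c references miss only in the first iteration; from the second iteration onward, the blocks prefetched one iteration earlier are already resident, matching the behavior described above.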
Hence, the access to R at i enjoys a definite cache hit.
This effect was not detected in Wilkinson and Neimat because that study used a probabilistic cache model that assumed that cache hit probabilities were independent of cache size.
We then carry out a computer-based simulation of the content caching and distribution process in Microsoft Visual C++, using the solution obtained from Lingo, and evaluate the performance in terms of cache hit rate, content delivery latency, and traffic volume.
If there is a cache hit, the client plays the initial segments of the video while the proxy formulates a streaming schedule to stream the remaining video from the streaming server.
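The proxy's decision can be sketched as follows; this is a hypothetical illustration of the prefix-caching logic described above, with all names our own. On a hit, the cached initial segments are served immediately and only the remainder is scheduled from the origin server.

```python
def handle_request(video_id, prefix_cache, total_segments):
    """prefix_cache maps video_id -> number of initial segments held.
    Returns (segments playable now from the proxy,
             segments to schedule from the streaming server)."""
    cached = prefix_cache.get(video_id, 0)
    if cached > 0:
        # Cache hit: play the cached prefix right away ...
        play_now = list(range(cached))
        # ... and stream the rest from the server in the background.
        remainder = list(range(cached, total_segments))
    else:
        # Cache miss: everything must come from the server.
        play_now = []
        remainder = list(range(total_segments))
    return play_now, remainder
```

The benefit of the hit case is that playback starts from the proxy with no server round-trip, hiding the startup latency of fetching the remainder.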
The average class cache hit ratios approached 54% with a cache size of 64k, revealing the effective limits of the prediction mechanism as well as demonstrating the effective parallelization of class delivery with program execution.
The central design goals of our strategy include reducing query latency, improving the cache hit ratio, minimizing client disconnection overheads, making better use of the wireless channel, and conserving client energy.
In this case, the machine "notifies" software that the informing operation was a cache hit by squashing the instruction in the issue slot following that informing operation.
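A software model of that contract (our own simplification of the hardware mechanism, with hypothetical names) treats the slot after the informing load as a miss handler: on a hit the handler is "squashed", i.e. never runs, so the common case costs nothing.

```python
class InformingMemory:
    """Toy model of an informing memory operation: the load reports
    hit/miss, and the following 'issue slot' (a miss-handler callback)
    executes only when the load missed."""

    def __init__(self):
        self.cache = set()

    def informing_load(self, addr, on_miss):
        hit = addr in self.cache
        self.cache.add(addr)      # line is resident after the access
        if not hit:
            on_miss(addr)         # slot runs only on a miss ...
        return hit                # ... and is squashed on a hit
```

Software can use the handler to count misses or trigger prefetches without paying any per-hit overhead, which mirrors the motivation for squashing on a hit.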