Computational redundancy of an image represents the amount of computation that can be skipped to improve performance. To calculate and exploit the computational redundancy of an image, a similarity measure is required to identify similar neighborhoods of pixels in the image. In this paper, we present two similarity measures: a position-invariant histogram-based measure and a rotation-invariant multiresolution histogram-based measure. We demonstrate that, using the position-invariant and rotation-invariant similarity measures, the computational redundancy of natural images increases on average by 34% and 28%, respectively, compared to the basic similarity measure. This increase in computational redundancy can translate into further performance improvement: in a case study, the average increase in actual speedup is 211% for the position-invariant measure and 35% for the rotation-invariant measure.
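The abstract does not specify the measures' details; as a minimal sketch of the idea behind a position-invariant histogram-based measure, the following compares normalized intensity histograms of two pixel neighborhoods. Because a histogram discards pixel positions, any rearrangement of the same pixels is judged identical. The function names, bin count, and histogram-intersection score are illustrative assumptions, not the paper's definitions:

```python
import numpy as np

def patch_histogram(patch, bins=16):
    """Normalized intensity histogram of an 8-bit grayscale neighborhood.
    A histogram ignores where pixels sit, so this descriptor is
    position-invariant by construction (an assumed simplification)."""
    hist, _ = np.histogram(patch, bins=bins, range=(0, 256))
    return hist / hist.sum()  # normalize so patch size does not matter

def histogram_similarity(patch_a, patch_b, bins=16):
    """Histogram intersection in [0, 1]; 1.0 means identical histograms."""
    ha = patch_histogram(patch_a, bins)
    hb = patch_histogram(patch_b, bins)
    return np.minimum(ha, hb).sum()

# A shifted copy of a patch contains the same pixels in different
# positions, so it is a perfect match under this measure.
rng = np.random.default_rng(0)
a = rng.integers(0, 256, size=(8, 8))
b = np.roll(a, shift=3, axis=1)  # same pixel values, permuted positions
print(histogram_similarity(a, b))  # prints 1.0
```

A rotation-invariant multiresolution variant could, in the same spirit, compare such histograms computed at several smoothing scales, since histograms of a patch are unchanged by rotating its contents; the paper's actual construction may differ.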