As a holder of dozens of patents, many of them on processes derived from algorithms, I fully understand (as well as anyone can) the implications of current case law. To patent an algorithm, you have to recast it as an operation to be performed on a general-purpose computer. As long as you do that, you can get the patent. I fully agree that there's nothing to stop you from computing GIFs with pencil and paper, or on an abacus, or in your head. But is that a relevant distinction, or merely an academic one?
Additionally, for the algorithm to be patentable once it's transformed into an operation performed on a general-purpose computer, it has to be non-trivial, which I take to mean multi-step. Simply multiplying by a constant, even when done on a general-purpose computer, is not patentable.
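To make that triviality distinction concrete, here's a rough Python sketch of the two cases: a single-step operation versus a multi-step algorithm like the LZW compression at the heart of the GIF patent. The function names are my own, and the LZW routine is a simplified illustration, not the actual patented GIF implementation (real GIF/LZW uses variable-width codes, clear codes, and a capped dictionary).

    # Trivial, single-step: multiplying by a constant. Running this on a
    # general-purpose computer doesn't make it patentable.
    def scale(x, k=42):
        return x * k

    # Non-trivial, multi-step: a bare-bones LZW compressor, the kind of
    # algorithm the GIF patent covered. Simplified illustration only.
    def lzw_compress(data: bytes) -> list[int]:
        # Start with a dictionary of all single-byte strings (codes 0-255).
        dictionary = {bytes([i]): i for i in range(256)}
        next_code = 256
        current = b""
        output = []
        for byte in data:
            candidate = current + bytes([byte])
            if candidate in dictionary:
                # Keep extending the current match.
                current = candidate
            else:
                # Emit the code for the longest match, then add the new
                # string to the dictionary for future matches.
                output.append(dictionary[current])
                dictionary[candidate] = next_code
                next_code += 1
                current = bytes([byte])
        if current:
            output.append(dictionary[current])
        return output

    if __name__ == "__main__":
        print(scale(10))                                  # 420
        print(lzw_compress(b"TOBEORNOTTOBEORTOBEORNOT"))

The first function is one arithmetic step; the second maintains state, builds a dictionary as it goes, and makes a decision at every byte. That gap is roughly what the "non-trivial" requirement is getting at.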