I've started work on a long rant titled "Counting is Harder Than You Think". In general, most people assume counting is one of the easiest things for computers to do, because people learn counting in elementary school and forever associate it with "easy". (Someone's never asked the elementary school teacher's opinion of that.)
"How hard can it be? It's just a SELECT COUNT() in SQL!" Except that COUNT() may be burning a ton of CPU/IO the server could be spending on other work. Sure, an index might speed it up, but you can't really index an index, and eventually you're right back where you can't afford the CPU/IO time.
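One common workaround, sketched below with Python's built-in sqlite3 module: instead of paying for a full COUNT() scan on every request, keep a running counter that the database maintains with triggers. The table and trigger names here are made up for illustration; this is a minimal sketch of the idea, not a recommendation for any particular database.

```python
import sqlite3

# In-memory database for the sketch.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE things (id INTEGER PRIMARY KEY, payload TEXT);
    -- Single-row counter table, kept in sync by triggers.
    CREATE TABLE thing_count (n INTEGER NOT NULL);
    INSERT INTO thing_count VALUES (0);
    CREATE TRIGGER things_ins AFTER INSERT ON things
        BEGIN UPDATE thing_count SET n = n + 1; END;
    CREATE TRIGGER things_del AFTER DELETE ON things
        BEGIN UPDATE thing_count SET n = n - 1; END;
""")

conn.executemany("INSERT INTO things (payload) VALUES (?)",
                 [("row %d" % i,) for i in range(1000)])
conn.execute("DELETE FROM things WHERE id <= 10")

# Reading the counter touches one row; COUNT(*) rescans the table
# every time you ask.
cached = conn.execute("SELECT n FROM thing_count").fetchone()[0]
exact = conn.execute("SELECT COUNT(*) FROM things").fetchone()[0]
print(cached, exact)
```

The trade-off, of course, is that the triggers tax every write and the counter row becomes a contention hotspot under concurrent load, which is exactly why real systems often settle for estimates instead.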
People just assume computers have exact numbers at all times. Some of that is a bad-UX-design problem ("why are we showing a meaningless estimate like 1,492,631 and not 'about a million things to do'?"), but much of it seems to be that people simply think counting is easy.
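The UX fix is cheap. A minimal sketch of rounding an estimate to an honest order of magnitude (the `humanize` helper and its thresholds are my own invention for illustration):

```python
def humanize(n: int) -> str:
    """Round a large count to a friendly, honest approximation."""
    for limit, word in ((1_000_000_000, "billion"),
                        (1_000_000, "million"),
                        (1_000, "thousand")):
        if n >= limit:
            # %g drops trailing zeros: 2.0 -> "2", 1.5 -> "1.5"
            return "about %g %s" % (round(n / limit, 1), word)
    return str(n)  # small counts can just be shown exactly

print(humanize(1_492_631))  # about 1.5 million
```

A display like "about 1.5 million" stops promising a precision the system never had in the first place.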