I often see the sentiment that, essentially, a method is much better because it uses only matvecs (rather than factorize-and-solve). This always confuses me, because the game for numerical linear algebra folks is inventing funky preconditioners to fit their problem, right? Unless people are subtly saying “our new method converges incredibly quickly…”
There are tons of different games to play. Designing a preconditioner that's specific to the problem being solved can help you beat incomplete LU, often by a substantial (even asymptotic) margin.
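To make that concrete, here's a rough sketch (my own toy example, nothing authoritative) of plugging a hand-rolled diagonal/Jacobi preconditioner into CG through scipy's M= hook. The matrix, size, and scaling are all made up just to show the mechanism of exploiting problem structure:

    import numpy as np
    import scipy.sparse as sp
    import scipy.sparse.linalg as spla

    n = 2000
    # Toy SPD system: 1D Laplacian plus a diagonal varying over ~4 orders of magnitude
    lap = sp.diags([-1.0, 2.0, -1.0], offsets=[-1, 0, 1], shape=(n, n), format="csr")
    A = (lap + sp.diags(np.logspace(0, 4, n))).tocsr()
    b = np.ones(n)

    def cg_iterations(A, b, M=None):
        count = [0]
        def cb(xk):
            count[0] += 1
        x, info = spla.cg(A, b, M=M, callback=cb)
        return count[0], info

    # Jacobi (diagonal) preconditioner: exploit the badly scaled diagonal we know is there
    M_jacobi = spla.LinearOperator(A.shape, matvec=lambda r: r / A.diagonal())

    print("plain CG:  ", cg_iterations(A, b))
    print("Jacobi PCG:", cg_iterations(A, b, M=M_jacobi))

Even something this crude cuts the iteration count dramatically here, which is the whole point: the more you know about where the conditioning comes from, the cheaper the preconditioner can be.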
If you have a problem that's small enough to factorize and solve, that's great; that probably is the best approach. But direct factorization doesn't scale in parallel, and for really big problems, iterative methods are the only game in town.
It's all about knowing the range of methods that are applicable to your problem and the regimes in which they operate best. There's no one-size-fits-all solution.
I agree; I was just using ILU as a well-known example. Also, because ILU0 has no fill-in, it should (IMO) be considered the “baseline,” in the sense that not trying it should be justified somehow.
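For what it's worth, here's a sketch of how I'd try that baseline in scipy. Caveat: scipy's spilu is SuperLU's thresholded ILU rather than literal zero-fill ILU(0), so the drop_tol/fill_factor values below are just stand-ins for “keep it cheap”:

    import numpy as np
    import scipy.sparse as sp
    import scipy.sparse.linalg as spla

    n = 1000
    A = sp.diags([-1.0, 2.0, -1.0], offsets=[-1, 0, 1], shape=(n, n), format="csc")
    b = np.ones(n)

    # Incomplete LU, kept cheap via a drop tolerance and a fill cap (not literally zero fill)
    ilu = spla.spilu(A, drop_tol=1e-4, fill_factor=2.0)
    M = spla.LinearOperator(A.shape, matvec=ilu.solve)

    count = [0]
    def cb(res_norm):
        count[0] += 1

    x, info = spla.gmres(A, b, M=M, callback=cb, callback_type="pr_norm")
    print("GMRES + ILU:", count[0], "iterations, info =", info)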
I wonder if they put any matrices in the SuiteSparse collection, or anything like that. It would be a nice fun weekend project to just play around with them.
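If anyone wants a starting point for that weekend project: grab any MatrixMarket (.mtx) file from sparse.tamu.edu and something like this gets you going (the filename below is just a placeholder for whatever you download):

    import numpy as np
    import scipy.io
    import scipy.sparse.linalg as spla

    # Hypothetical local file: any .mtx download from the SuiteSparse collection
    A = scipy.io.mmread("bcsstk14.mtx").tocsr()
    print("shape:", A.shape, "nnz:", A.nnz)

    b = np.ones(A.shape[0])
    x, info = spla.gmres(A, b, maxiter=200)
    print("GMRES info:", info)  # 0 = converged, >0 = hit the iteration cap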
ILU0 is practically free, right?