If you're saying that it's the uncertainty in the initial measurement, then we're in agreement. If the initial measurement were perfect, the only source of error would be the finite timestep. N-body simulation itself is deterministic, and so the only source of randomness is our uncertainty about the object's true mass, size, shape, position, velocity, etc.
The N-body _reality_ _might_ be deterministic. An N-body simulation on a digital computer will still introduce errors from the finite time steps (and floating-point rounding), even if you had perfect knowledge of the initial conditions.
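To make that concrete, here's a minimal sketch, with a 1D harmonic oscillator standing in for an N-body system (the function name and step sizes are just illustrative): the initial conditions are exact, yet the integrated result still misses the analytic answer, and the miss shrinks as the timestep does.

```python
import math

def euler_oscillator(dt, t_end=10.0):
    # Exact initial conditions for x'' = -x: x(0) = 1, v(0) = 0.
    n = round(t_end / dt)
    x, v = 1.0, 0.0
    for _ in range(n):
        x, v = x + dt * v, v - dt * x   # explicit Euler step
    return x

# The exact solution is x(t) = cos(t), so any leftover error is pure
# timestep truncation; it shrinks roughly linearly with dt (Euler is O(dt)).
for dt in (0.01, 0.005, 0.0025):
    print(f"dt={dt}: error={abs(euler_oscillator(dt) - math.cos(10.0)):.6f}")
```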
The errors are deterministic. Determinism has nothing to do with the existence of errors; it's about uncertainty. They're different things. A deterministic system produces the same results every time given the same initial conditions, so if there are numerical errors, they will be identical on every run. A non-deterministic system gives you different results every time from the same initial conditions, with some variance, and you can still have numerical errors in such a system.
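A quick sketch of that distinction (toy Euler integrator, illustrative numbers): rerunning with identical inputs reproduces the result bit for bit, numerical errors included; only changing the inputs changes the output.

```python
def integrate(x0, v0, dt=0.01, steps=1000):
    x, v = x0, v0
    for _ in range(steps):
        x, v = x + dt * v, v - dt * x   # explicit Euler, has truncation error
    return x

run_a = integrate(1.0, 0.0)
run_b = integrate(1.0, 0.0)
print(run_a == run_b)                        # True: identical, errors and all
print(integrate(1.0 + 1e-9, 0.0) == run_a)   # False: uncertainty in the input,
                                             # not randomness in the simulation
```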
Ironically, reality probably isn't deterministic. It definitely isn't at small scales (e.g. radioactive decay). If it's non-deterministic at a macro scale, the effect is small enough that we don't see it.
That's the point: reality isn't deterministic, so you can't really use deterministic math to describe it. That's just an approximation, regardless of errors in the simulation. It's also why you run Monte Carlo simulations: not to even out simulation errors, but to compute as many probable outcomes as possible and build a probability distribution that represents your best bet at predicting the non-deterministic reality. If you could "run" reality twice, you're not going to get the same result.
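Here's a rough sketch of that kind of Monte Carlo run (toy integrator again; the measurement sigmas and sample count are made up): the simulation itself is deterministic, so all of the spread in the output distribution comes from the sampled uncertainty in the inputs.

```python
import math
import random

def simulate(x0, v0, dt=0.01, steps=1000):
    x, v = x0, v0
    for _ in range(steps):
        x, v = x + dt * v, v - dt * x   # deterministic explicit Euler
    return x

random.seed(0)
# Sample initial conditions from (hypothetical) measurement uncertainty.
outcomes = [simulate(random.gauss(1.0, 0.01), random.gauss(0.0, 0.01))
            for _ in range(1_000)]

mean = sum(outcomes) / len(outcomes)
std = math.sqrt(sum((o - mean) ** 2 for o in outcomes) / len(outcomes))
print(f"predicted position: {mean:.4f} +/- {std:.4f}")
```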