
> I suppose you could simulate having a large population by maintaining a set of snapshots, but it still seems like not the same thing as a classic evolutionary technique.

I've got an experiment using this exact technique running on a machine right now, but I wouldn't argue that it's a good substitute for an actual population.

The reason I attempted this is that snapshotting a candidate is an unbelievably cheap operation. Once the snapshot buffer is in place there is no further allocation; it's just a handful of low-level memory block copies. I can get several million iterations per candidate per hour with this hack, but the trade-off is giving up a large population.
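
To make the "just memory block copies" part concrete, here's a rough sketch in C of what I mean by a single-candidate snapshot/restore loop. The names (candidate_t, mutate, score, hill_climb) and the trivial bit-flip mutation / popcount fitness are placeholders I'm inventing for illustration, not my actual code:

  #include <stdint.h>
  #include <stdio.h>
  #include <stdlib.h>
  #include <string.h>
  #include <time.h>

  #define CANDIDATE_BYTES 4096   /* placeholder; real size depends on the genome */

  typedef struct { uint8_t genome[CANDIDATE_BYTES]; } candidate_t;

  /* Placeholder mutation: flip one random bit in place. */
  static void mutate(candidate_t *c)
  {
      size_t i = (size_t)rand() % CANDIDATE_BYTES;
      c->genome[i] ^= (uint8_t)(1u << (rand() % 8));
  }

  /* Placeholder fitness: count set bits (stands in for a real objective). */
  static double score(const candidate_t *c)
  {
      long bits = 0;
      for (size_t i = 0; i < CANDIDATE_BYTES; i++)
          for (int b = 0; b < 8; b++)
              bits += (c->genome[i] >> b) & 1;
      return (double)bits;
  }

  static void hill_climb(candidate_t *cand, long iterations)
  {
      /* One snapshot buffer, fixed up front; nothing allocates on the hot path. */
      static candidate_t snapshot;
      double best = score(cand);

      for (long i = 0; i < iterations; i++) {
          memcpy(&snapshot, cand, sizeof snapshot);        /* checkpoint        */
          mutate(cand);
          double s = score(cand);
          if (s > best)
              best = s;                                    /* keep the mutation */
          else
              memcpy(cand, &snapshot, sizeof snapshot);    /* restore           */
      }
  }

  int main(void)
  {
      srand((unsigned)time(NULL));
      candidate_t cand = {0};
      hill_climb(&cand, 1000000);
      printf("final score: %f\n", score(&cand));
      return 0;
  }

The point is that accept/reject is at most two block copies per iteration, which is why per-candidate throughput gets so high while diversity stays at exactly one candidate.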

My conclusions so far: the snapshot/restore approach is interesting, but it mostly just gets you into local minima faster than everything else. Real population diversity is an information-theory problem that drags in memory bandwidth (among other things) and slows things down by 3-4 orders of magnitude from where we'd prefer to be.
