I stuck a 512GB SSD in my (8GB RAM) 2010 MBP maybe a year(?) ago. The general speed boost was incredible, of course, but the thing that made the biggest impression was definitely seeing several GB paged out, with little performance degradation.
I have had some weird experiences when pushing it hard, though. At one point when I got up to ~15-20GB swapped out, I encountered a kernel panic. I wasn't doing anything fundamentally weird, just loading a large amount of data into "RAM".
I should try it again and see what happens. I wonder how hard Apple stresses this functionality in QA?
All of my kernel panics and spinners of death were due to running some Java IDE or XML tool loading up 100MB files and the like (sometimes in-memory parse trees for those sized files can push the heap to multi GB levels).
Were you using a JVM when you ran into the system glitches?
No, the memory-hungry process I was running was reasonably simple C that I'd written myself. It was semi-contrived, reflecting the extreme of an internal use case, but fundamentally I was just moving around/massaging bytes in big, boringly-malloc'd blocks of memory.
I can't swear that there were no Java processes running on the machine at the time, though. And I often have a Windows or Linux VM running in the background.
I stuck an Intel SSD in my 2011 MBP, as well as 16GB of RAM (it was really, really cheap).
I disabled the swap file completely, and things are as snappy as ever - I can't think of an application I use that would eat so much RAM that I'd need to page to disk.