Nvidia can't make GPUs fast enough. I doubt 10xing training and/or inference efficiency would decrease demand; I'd be surprised if it didn't increase it instead. Mind you, Nvidia is pushing hard on TensorRT, which optimizes models at inference time and delivers major throughput gains (not 10x though lol).
But if things get too efficient for individual users, you won't need an Nvidia GPU anymore. People will use cheaper hardware instead. I'm looking forward to running good models at decent speed on a low-end CPU or whatever crappy GPU is in my phone.
I had the same thought this morning and was debating selling my nvda stock when I saw this. They feel well-positioned right now, as with crypto a few years ago, but if an efficiency breakthrough let commodity CPUs handle inference instead, that advantage could vanish quickly.
edit: but you are right that for the AI companies not open-sourcing their models, it's an advantage to have that efficiency when others don't.