The highest frequency you can practically send long distances over coax is around 1-2 GHz; when I had a cable modem, the highest channel I saw was at 900 MHz. So regardless of what encoding scheme you use, you will never get more than a couple of GHz of bandwidth.
Visible light starts at around 400 THz, so right off the bat you get hundreds of thousands of times more headroom before physics becomes the limiting factor.
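A quick back-of-envelope check of that comparison, assuming ~2 GHz of usable coax bandwidth and taking ~400 THz as the low (red) end of visible light:

```python
# Rough headroom comparison: optical carrier frequency vs. practical coax bandwidth.
coax_hz = 2e9              # ~2 GHz, practical upper end for long coax runs
visible_light_hz = 400e12  # ~400 THz, low (red) end of the visible spectrum

ratio = visible_light_hz / coax_hz
print(f"{ratio:,.0f}x")    # 200,000x
```

The exact numbers are approximations, but the point stands either way: the gap is several orders of magnitude.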
I was thinking that the response/recovery time of the silicon detectors reacting to the laser pulses would be a limiting factor; is that at all valid?
Like, you cannot blink faster than <x> nanoseconds and still have a CCD(?) see it properly. (I'm sure the actual receiver isn't a CCD with readout, etc., but whatever the correct mechanism is, is there some natural minimum read time?)