I'd like to see an A/B test between the two options. It sounds plausible that a less-sophisticated buyer might see the Visa logo light up after they've typed the first digit of their card and get confused ("WHAT WITCHCRAFT IS THIS").
I don't have a publishable version of the data, but I can confirm from running a checkout flow that's processed $100M that fewer fields = more conversions. If you light up the logo properly, people understand it.
It helps that the savvier ecommerce sites are moving to this model, so more and more users are familiar with the experience.
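For context, "lighting up the logo" is usually just a prefix match on the leading digits of the card number (the IIN/BIN ranges). A minimal sketch covering only a few well-known prefixes (the real ranges, e.g. Mastercard's 2221–2720 series, are longer than this):

```python
def detect_brand(digits: str) -> "str | None":
    """Guess the card network from the leading typed digits.

    Returns None until enough digits have been entered to decide,
    so the UI can keep all logos lit in the meantime.
    """
    if not digits:
        return None
    if digits[0] == "4":                      # Visa: starts with 4
        return "visa"
    two = digits[:2]
    if two in ("34", "37"):                   # American Express: 34 or 37
        return "amex"
    if len(two) == 2 and "51" <= two <= "55":  # Mastercard (classic 51-55 range)
        return "mastercard"
    return None
```

Note the early `None` returns: with a single "5" or "3" typed, it's still ambiguous, so nothing highlights yet, which is exactly the behavior a confused buyer would see mid-keystroke.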
Conversion rate isn't the only thing you optimize for. I helped a company reduce its fraud/chargeback rate by asking for the card type and not auto-correcting it against the number. A type mismatch was one of a dozen or so scoring items whose total, if over a specific threshold, tripped a second level of verification.
Unfortunately I don't have a linkable source for this, but the Democratic fundraising platform ActBlue gave a talk on A/B testing donation forms, and it turns out that when they removed the "select card type" field, donations went -way- down. You'd think the Law of Forms would apply (fewer fields = higher conversions), but it doesn't here. Apparently people assumed something was wrong with the form and therefore didn't trust it; when they removed that field, they also saw a huge uptick in people submitting reports that the form was broken.
I'm hoping someone from ActBlue's dev team sees this and can jump in. They run a lot of fascinating tests on credit card forms.