
> Tell me you don’t use Copilot without telling me about it.

You don't accept arguments against the use of Copilot from people unless they... use it?

That's a nifty way to ignore any and all criticism of Copilot, or indeed any discussion about any ethical issue ever.



I believe the argument is that you shouldn't accept arguments against the use of Copilot from people unless they have tried to use it in a realistic context. That seems reasonable to me; it's the bare minimum for forming an informed opinion. I think the wording was perhaps poor, but your interpretation is a little reductive/disingenuous.


I don't see how using Copilot makes the copyright question any less serious.

Because it's useful, then it's not a problem?

Well, it's also useful to send our non-recyclable trash to third-world countries, and every first-world country should try it. It will definitely make the consequences less serious if everyone does it.

Not apples to apples but I guess you get the picture.


As someone who has used copilot since the early beta days, what I think people are saying is that nobody uses Copilot to generate full functions like this. It's more of an intelligent auto complete. It's fantastic for repetitive autocomplete where certain things have to be changed, and for quickly getting out boilerplate code. You can put a list in a comment along with a format and generate data structures quickly and easily. You can solve small problems quickly, allowing you to focus on the bigger picture.
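The comment-to-data-structure workflow described above looks roughly like this. (A hand-written illustration of the pattern, not actual Copilot output; the names and data are made up.)

```python
# Prompt comment: HTTP status codes as code -> reason, covering
# 200 OK, 301 Moved Permanently, 404 Not Found, 500 Internal Server Error.
# The dict below is the kind of completion Copilot typically suggests
# from a comment like the one above.
STATUS_CODES = {
    200: "OK",
    301: "Moved Permanently",
    404: "Not Found",
    500: "Internal Server Error",
}

print(STATUS_CODES[404])  # Not Found
```

The point is that the data already exists in the comment; the tool just saves the mechanical retyping into a structure.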

It's sort of like a power tool: sure, you could use a screwdriver, but a drill with a screwdriver attachment will be quicker. Hammers are good, nail guns are quicker. You'd never expect someone to use a drill with a screwdriver attachment if they'd never used a screwdriver before.

There are for sure things to be improved, such as the recent post on how you could put in a very specific seed and get out a specific function that it shouldn't. The answer here isn't to shut down the project, at a net loss for everyone, but to find ways to improve it.

As others have said, with Copilot gone and the new demand created, the vacuum will bring in community projects that will happily scrape every public repository they can get their hands on.


Now I understand what you mean. Still, it's pretty crappy, because those minor autocomplete strings only exist because Microsoft used code without permission, without crediting the original owners, and/or in breach of the licenses of the original corpus.

I use Copilot every day, I love it, but it still leaves a bad taste in my mouth knowing that people out there worked really hard on their code, and harder on building OSS licenses, just for Microsoft to throw all that out of the window.

Feels like licenses don't matter anymore. My own code doesn't matter much, but it's about principles, dude. Licenses are there and they should be respected; if not, then it's just anarchy, and we all know anarchy only works in very specific scenarios. Microsoft is not a part of any of those scenarios.


I believe the argument being made is that in _actual, real-world_ use of copilot, no copyright infringement happens. In order to make an informed decision as to whether you agree with that, you can try copilot to reach an informed conclusion. There is no cost to try it. Unlike your example, where "trying" has an immediate cost -- which is why that example doesn't make too much sense here.


Yep, now it's much more clear, thanks for clarifying.

See your sister comment's child for my reply.


Because it's a net good to the world, it's not a problem. If the benefit is orders of magnitude greater than the harm, then it's good.


It’s a net harm for the programmers whose code is being willfully plagiarized.

It’s a net boon for Microsoft in their efforts to rule the world.

It’s a net loss for society and ethics.

Open up Copilot's code, Microsoft: if you are so sure that everyone must wear transparent underwear, let's see you wearing some. Train Copilot on Windows 11 code. It's not public domain.

Truth matters. Lies matter.


Expand on the unethical part. So people published code that could be referenced and copied on GitHub. There was no ethical problem, the world, society were happy.

GitHub makes a convenient way to search and contextualise this publicly available code and paste it into your code (adjusting local scope, format, and language along the way). Suddenly we have crossed an ethical line!?

Which ethical line? Are you pretending people never copy and pasted open source code before copilot? Are you pretending open source code never copy and pasted other open source code? That we were in an ethically pure world until copilot came along?


> So people published code that could be referenced and copied on GitHub. There was no ethical problem, the world, society were happy.

This code carries different licenses. You can't just copy code randomly without checking the license first.

Copilot serves it to unaware users, stripped of the license. Even if a Copilot user wants only to reuse code licensed in a way that allows it, Copilot will serve him code under restrictive licenses without him being aware.


You can just copy and paste code without checking the license. People do it all the time.

GitHub doesn’t force you to accept the license in the repository before showing you the code.


> It’s a net harm for the programmers whose code is being willfully plagiarized.

What's the harm, specifically?

Say it copies that snippet of workflow scheduling code I made at work yesterday or the greasemonkey script I made in my own time.

How is my life worse?


> I believe the argument is that you shouldn't accept arguments against the use of copilot from people unless they have tried to use it.

I disagree, and this does not hold up generally: We can, and should, argue things we have not tried or experienced, like heroin and murder. What makes it so that this has to be tried?

> It's the bare minimum to make an informed opinion.

Only if the usefulness is what is in question. But it is not.


> We can, and should, argue things we have not tried or experienced, like heroin and murder. What makes it so that this has to be tried?

It's not that you absolutely have to have experience with something, but you'd be foolish to discount the input of people who do. In debates about drug policy I try to be polite to people with zero first hand experience, but their contributions are rarely of interest. Murder is a bit more abstract insofar as anyone who has fully experienced it by definition didn't survive to testify, but I give a lot more weight to the views of people that have first-hand knowledge of violence and crime.

It's not that you shouldn't weigh in on a topic without first hand experience, but that it's a good idea to specify the scope of your understanding, or frame uncertainties as open questions rather than assumptions.


Correct, it doesn't hold up generally. But it doesn't need to. It holds up here. We do not try things when there is exceptional risk or cost in the trying. Here there is no cost to trying, so it does not make sense not to try.

I believe the argument being made is that in _actual, real-world_ use of copilot, no copyright infringement happens. So it's not just about usefulness.


> I believe the argument being made is that in _actual, real-world_ use of copilot, no copyright infringement happens. So it's not just about usefulness.

How would you know though? The burden of proof is on Copilot. Especially now that it has been shown to spit out copyrighted code.


You’re right, trying Copilot is equivalent to committing murder.

/s


>> Tell me you don’t use Copilot without telling me about it.

> You don't accept arguments against the use of Copilot from people unless they... use it?

> That's a nifty way to ignore any and all criticism of Copilot, or indeed any discussion about any ethical issue ever.

"I only listen to people who agree with me, but to make that sound legitimate, I have a somewhat indirect way of saying so."


They should at least try to understand how it's actually used, instead of imagining that it's simply used to steal their largely replaceable code.


It doesn't matter how it's used. Do you think Microsoft would be happy with someone training a model on Windows source code, as long as they didn't use it to reproduce the code?


If Microsoft were confident Copilot doesn't produce infringing code, they would have included the Windows and Office codebases in the training data. I wonder what will come out of discovery.


You think MS's code quality is high enough to train an AI on?


Do you think they audited every open source code base that was used in training for quality?


“Their largely replaceable code”

Smells like: “I stole this lousy apple that wasn’t any good.” Then why did you steal it?

Put your money where your mouth is, Microsoft, train copilot on your own code!!!

Don’t wanna train it with Windows 11 code? Prefer to hijack others’ projects and use them for your needs, and then pretend that insulting others and calling their code worthless will get you off the hook?

Backfire


> Smells like: “I stole this lousy apple that wasn’t any good.” Then why did you steal it?

The lousy code trained Copilot on what a switch statement looks like, so it can autocomplete mine for me.


On a different website I argued with a Microsoft employee who said that Copilot is great and so on, and who would not discuss it unless I tried it.

I tried telling him that it requires a credit card number to try it, but he didn't believe me… I guess the thought that non-Microsoft employees have to pay for Microsoft stuff didn't occur to him.



