
The reddit post feels like engagement bait to me.

Why would you ask the community a question like "how to source control" when you've been working with (presumably) a programming genius LLM that could provide the most personally tailored path for baby's first git experience? Even if you don't know that "git" is a thing, you could ask questions as if you were a golden retriever and the model would still inevitably recommend git in the first turn of conversation.

Is it really the case that a person who has the ability to use a compiler, IDE, LLM, web browser, reddit, etc., somehow simultaneously lacks the ability to frame basic-ass questions about the very mission they set out on? If stuff like this is not manufactured, then we should all walk away feeling pretty fantastic about our future job prospects.




If you start from scratch trying to build an ideal system to program computers, you always converge on the time-tested tooling we have now: code, compilers, interpreters, versioning, etc.

People think "this is hard, I'll re-invent it in an easier way" and end up with a half-assed version of the tooling we've honed over the decades.


> People think "this is hard, I'll re-invent it in an easier way" and end up with a half-assed version of the tooling we've honed over the decades.

This is a win in the long run, because occasionally the thing people labor over really is a better way.


Agreed. We wouldn't have distributed version control, container environments, profilers, etc. without people trying to make programming better. But those all came from improving a single aspect (better versioning, reproducibility, debugging, etc.).

When the goal is "re-invent programming to make it easier", all you get is a hodgepodge of half-assed solutions, like GP said. Enhancing traditional, focused workflows seems a lot more interesting to me than a "coding assistant".

Hopefully AI tooling will continue to evolve. I don't see how you get around the reliability issues with this iteration of AI (GPT+RLHF+RAG, etc). Transfer learning is still abysmal.


The account is a throwaway, but based on its short posting history and replies, I don't have reason to believe it's a troll:

https://www.reddit.com/r/cursor/comments/1inoryp/comment/mdr...

> I'm not a dev or engineers at all (just a geek working in Finance)

This fits my experience of teaching very intelligent students how to code: if you're an experienced programmer, you simply cannot fathom the kinds of assumptions beginners will make due to gaps in foundational knowledge they haven't yet built. I remember having to tell students to be mindful when searching Stack Overflow for help, because something as simple as an error from Requests (e.g. while doing web scraping) could lead them down a rabbit hole of "solutions" such as completely uninstalling Python in favor of a different/older version.
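
To make that concrete, here's a minimal sketch of the kind of toy scraper I mean (the URL is a placeholder, not anything from the original post). The traceback from a snippet like this usually names the actual problem, and the fix is almost never reinstalling Python:

    import requests

    url = "https://example.com"  # placeholder target
    try:
        resp = requests.get(url, timeout=10)
        resp.raise_for_status()  # raises requests.HTTPError on 4xx/5xx
    except requests.RequestException as err:
        # Timeouts, DNS failures, and HTTP errors all land here;
        # the error message itself is the lead worth searching for.
        print(f"request failed: {err}")
    else:
        print(resp.text[:200])

A beginner who copies the first Stack Overflow answer instead of reading that message can end up "fixing" their interpreter rather than their request.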


They were using Cursor, not a general LLM, and were asking their fellow Cursor users how they deal with the risk of Cursor destroying the code base.



