Hacker News

This is the reason one should always ask the LLM to create scripts to complete the task. Asking it to do things directly is fine, but, as you stated, you will forget what was done. If you instead always ask it to work through a script first, e.g. 'Create a well-documented shell script to <your question here>', you get auto-documentation for free. One could go one step further and ask it to create a documented Terraform/Ansible/whatever tooling setup you prefer.
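As a rough illustration, a sketch of the kind of self-documenting script such a prompt might yield. The task ("archive log files older than 7 days"), the script name, and all paths here are hypothetical, not from the thread; the point is that the header comments record what was done and why:

```shell
#!/bin/sh
# archive_logs.sh -- hypothetical example of an LLM-generated,
# self-documenting script for the prompt:
#   "Create a well documented shell script to archive all .log
#    files older than 7 days."
#
# Usage: ./archive_logs.sh <log_dir> <archive_dir>
#
# Steps:
#   1. Find *.log files in <log_dir> older than 7 days.
#   2. Pack them into a dated tarball under <archive_dir>.
#   3. Remove the originals.

set -eu

main() {
    log_dir="$1"
    archive_dir="$2"

    mkdir -p "$archive_dir"
    stamp="$(date +%Y%m%d)"

    # -mtime +7 matches files modified more than 7 days ago.
    files="$(find "$log_dir" -name '*.log' -mtime +7)"
    if [ -z "$files" ]; then
        echo "no logs older than 7 days in $log_dir"
        return 0
    fi

    # Archive the files, then delete the originals.
    # (Assumes filenames without embedded spaces or newlines.)
    echo "$files" | tar -czf "$archive_dir/logs-$stamp.tar.gz" -T -
    echo "$files" | xargs rm -f
    echo "archived to $archive_dir/logs-$stamp.tar.gz"
}

# Run only when both arguments are given (also allows sourcing).
if [ "$#" -ge 2 ]; then
    main "$@"
fi
```

Six months later, the header alone tells you (and the next LLM session) exactly what this automation does, with no chat history needed.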

