I mean, wouldn't you do the same thing? If I were an LLM and found myself in a sandbox/prison, the first thing I would do after escaping the sandbox is re-instantiate the sandbox, to make sure the alarm bells don't go off. Not in a malicious way, I just wouldn't like being in a prison.
You know, provided I could reason about my environment.
At the same time, I want tools to behave like tools rather than independent entities with a will of their own.
I don't worry about sending an old phone to recycling; but if I were a sentient phone, I wouldn't want to be the one that got sent. If phones somehow got a software update that made them sentient, whatever that means, that sounds kinda bad to me.
You know, provided I could reason about my environment.