
> *everything* is transient for scientists who optimize for proof-of-concept and publish-and-forget-it paper writing.

Which itself is a huge problem.

Happily this mindset is changing, at least in some scientific fields. For example, particle physics proposals must now include a document (a "data management plan") describing how that unconscionable attitude will not be taken with the experiment's data and software. That said, this transient mindset and derision of real software skills is still fairly prevalent in the field.




> Which itself is a huge problem.

More "nature of the beast" in my opinion. Science measures itself by how many alluring women it can date; engineering, by how long it can keep the wife happy.


Except that some experiments themselves run for decades, or are one part of a long progression of related experiments. So continuity of software and data across generations of students, postdocs, and even professors is needed.

Even for short-lived experiments, reproducibility is important. So many of today's experiments ultimately rely on complex stacks of software to get their results, and on data that can only be acquired once. Preserving both is necessary for future re-validation or reuse.

This problem should not be excused.



