That assumes you don’t value the time you spend dealing with the troubles that come up.
Like the other person said, it’s fine if you don’t, but for me it’s worth a little upfront cost to spend less time ordering new drives, putting the drive in the server, monitoring the rebuild of the array, etc.
None of that is an excuse for a lack of proper backups, because even new drives can fail catastrophically.
That’s not how fractions and math work though.
That’s not what the Planck length is. It’s the minimum resolvable accuracy, not a minimum unit of measurement. Meaning we can’t prove something was at a specific position to better than the Planck length. Not that it’s the building-block size of the universe.
It is a common misconception that it is the inherent “pixel size” or smallest possible length of the universe.[1] If a length smaller than this is used in any measurement, then it has a chance of being wrong due to quantum uncertainty.
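For reference, the standard definition in terms of fundamental constants:

\ell_P = \sqrt{\hbar G / c^3} \approx 1.616 \times 10^{-35}\ \text{m}

i.e. the scale at which quantum uncertainty is expected to make finer position claims unverifiable, not a pixel size of space.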
If you want my credentials, the second book derives the hydrogen atom.
Being continuous is not actually a requirement of being real.
That’s not what I said?
They’re “stable” energy states. That’s all.
The real numbers are the set of both rational and irrational numbers. Nothing about continuity in that definition.
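Spelled out (writing \mathbb{I} for the irrationals, just shorthand here):

\mathbb{R} = \mathbb{Q} \cup \mathbb{I}, \qquad \mathbb{Q} \cap \mathbb{I} = \emptyset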
They don’t make “discrete jumps” as in teleportation. They exist in stable, discrete energy levels, but that doesn’t imply things don’t move continuously.
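Concretely, the textbook hydrogen result mentioned above gives discrete bound-state energies,

E_n = -\frac{13.6\ \text{eV}}{n^2}, \quad n = 1, 2, 3, \ldots

but the state still evolves continuously in time under the Schrödinger equation; “discrete” refers to the allowed energies, not to positions teleporting.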
Mmmmmm don’t know about that.
The Planck length is the minimum resolvable accuracy of the universe. That doesn’t mean it’s a building block like the electron is.
That assumes dev resources are limitless. And for a company the size of Proton that’s certainly not true.
They can only have so many devs. So how they allocate them says a lot.
Also, given that most of the top complaints I’ve seen are about specific features that have been missing for ages, I think it’s safe to say they’re putting their eggs in too many baskets.