Hashing a file produces a result - this result is shared.
Rehashing the file should produce the same result - if not, the file has changed and should be discarded.
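A minimal sketch of that check in Python, hashing in chunks so a multi-GB iso doesn't have to fit in memory (the filename and published checksum here are hypothetical placeholders):

```python
import hashlib

def file_hash(path, algo="sha256"):
    """Hash a file in chunks and return the hex digest."""
    h = hashlib.new(algo)
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):  # 1 MiB at a time
            h.update(chunk)
    return h.hexdigest()

# Compare against the checksum published on the download page;
# on mismatch the file has changed and should be discarded.
# published = "..."  # hypothetical value copied from the website
# if file_hash("distro.iso") != published:
#     print("hash mismatch - discard the file")
```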
Whether you use md5, sha1 or sha512 - the usage is identical; the main practical differences are the time taken and the length of the resulting hash.
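For illustration, the three algorithms are called the same way and differ visibly in digest length - md5 gives 32 hex characters, sha1 gives 40, sha512 gives 128:

```python
import hashlib

data = b"example file contents"
for algo in ("md5", "sha1", "sha512"):
    digest = hashlib.new(algo, data).hexdigest()
    # md5 -> 32 hex chars, sha1 -> 40, sha512 -> 128
    print(algo, len(digest), digest)
```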
Change the content of the file and the hash will change. You can use the gpg signature to validate the file - verification will fail if changes have been made to the iso, whether accidentally during download or maliciously.
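To see that any change to the content changes the hash, compare digests of two byte strings differing in a single bit - a toy illustration of the principle, not the gpg verification step itself (the "iso contents" string is just a stand-in):

```python
import hashlib

original = b"stand-in for the real iso bytes"
tampered = bytearray(original)
tampered[0] ^= 1  # flip one bit

h1 = hashlib.sha256(original).hexdigest()
h2 = hashlib.sha256(bytes(tampered)).hexdigest()
print(h1)
print(h2)
# Even a one-bit change yields a completely different digest,
# so any accidental or malicious modification is detected.
assert h1 != h2
```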
Yes, there is a vanishingly small theoretical chance that two files will produce the same hash, but for validating the completeness of a downloaded file sha1 is just fine.
From the article: "It took nine quintillion SHA-1 computations, but they succeeded." It literally took an astronomical number of computations to get a duplicate. If it were for banking, maybe I would be worried, but for simply making sure you have downloaded a good copy of a file it's likely safe enough when it takes nine quintillion tries to get a bad match.
Consider an actor who gains access to the website/CDNs and wants to upload an evil iso: they will not bother with collisions - they will simply hash the evil image and supply the evil checksum. On top of that they can remove or break the signature download option, banking on users' implicit trust of the website, or just target the subset of users who never verify the signature.
The checksum carries no proof of authenticity; it is there to catch random errors after transfer, and could even be CRC32 for that purpose.
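CRC32 does catch random transfer errors, and in Python it is a single zlib call - a quick sketch of the point (the payloads are made up):

```python
import zlib

payload = b"downloaded bytes"
checksum = zlib.crc32(payload)

# A randomly corrupted copy almost certainly yields a different CRC.
corrupted = b"downloaded bytez"
print(checksum == zlib.crc32(corrupted))  # False

# But CRC32 offers no authenticity: it is not cryptographic, and an
# attacker can trivially craft content matching any 32-bit CRC.
```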
If you are downloading the torrent or already verifying the signature, the checksum becomes pointless.
The signature uses a SHA512 digest.