person594

joined 1 year ago
[–] person594@feddit.de 3 points 11 months ago

Kind of a tangent at this point, but there is a very good reason that couldn't be the case: according to the shell theorem, no point in the interior of a spherical shell of matter experiences any net gravitational force -- wherever you are inside the shell, the pulls from all directions cancel exactly.
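A quick numerical sanity check of the shell theorem (an illustrative sketch in plain Python with G = 1 and a unit sphere, not anything rigorous): sample many equal point masses on a spherical shell and sum their inverse-square pulls at an interior point; the net force comes out near zero, with the small residual just sampling noise.

```python
import math
import random

random.seed(0)
N = 20000                # number of sampled shell points
test = (0.5, 0.0, 0.0)   # interior test point, halfway out from the center

net = [0.0, 0.0, 0.0]
for _ in range(N):
    # uniform point on the unit sphere via a normalized Gaussian vector
    g = [random.gauss(0, 1) for _ in range(3)]
    norm = math.sqrt(sum(c * c for c in g))
    p = [c / norm for c in g]
    # inverse-square attraction toward p, with mass 1/N per point (G = 1)
    d = [p[i] - test[i] for i in range(3)]
    r = math.sqrt(sum(c * c for c in d))
    for i in range(3):
        net[i] += (1.0 / N) * d[i] / r ** 3

residual = math.sqrt(sum(c * c for c in net))
# tiny compared to the O(1) scale of a single unit mass at unit distance
print(f"net acceleration magnitude: {residual:.4f}")
```

With more sample points the residual shrinks toward the exact continuum answer of zero.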

Otherwise, though, the metric expansion of space is different from ordinary motion, and it isn't correct to say that things are being pushed or pulled. Rather, the distance between every pair of points in the universe increases over time, with the rate of increase proportional to the points' separation. Importantly, for very distant points, the distance can increase faster than the speed of light, which would be disallowed in any model that describes the expansion as objects moving in the traditional sense.
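The "proportional to separation" part is just Hubble's law, v = H0 · d. A small back-of-the-envelope sketch (using a round, assumed value of H0 ≈ 70 km/s/Mpc, not a precise measurement) shows how far away recession already exceeds the speed of light:

```python
# Hubble's law: recession speed grows linearly with distance, so beyond
# some distance (the Hubble radius) it formally exceeds the speed of light.
H0 = 70.0            # km/s per megaparsec (round illustrative value)
c = 299792.458       # speed of light, km/s

hubble_radius_mpc = c / H0      # distance at which v = c
v_at_5000_mpc = H0 * 5000.0     # recession speed at 5000 Mpc

print(f"Hubble radius: {hubble_radius_mpc:.0f} Mpc")         # ~4283 Mpc
print(f"Speed at 5000 Mpc: {v_at_5000_mpc:.0f} km/s (> c)")
```

That's roughly 14 billion light years; galaxies beyond that distance recede faster than c, which is fine precisely because nothing is moving through space at that speed.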

[–] person594@feddit.de 4 points 11 months ago

That isn't really the case: while many neural network implementations make nondeterministic optimizations, floating point arithmetic is in principle entirely deterministic, and it isn't too hard to make a neural network run deterministically when needed. That makes them perfectly applicable to lossless compression, which is what is done in this article.
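To make the distinction concrete, here's a minimal sketch: the same floating point operations in the same order are bit-for-bit reproducible; the nondeterminism people associate with neural nets comes from reordering (e.g., parallel reductions), because floating point addition isn't associative.

```python
vals = [1e16, 0.5, -1e16]

def left_to_right(xs):
    # fixed left-to-right summation order
    total = 0.0
    for x in xs:
        total += x
    return total

a = left_to_right(vals)  # 1e16 + 0.5 rounds back to 1e16, then -1e16 -> 0.0
b = left_to_right(vals)  # identical order -> bit-identical result
reordered = left_to_right([1e16, -1e16, 0.5])  # cancel first -> 0.5

print(a == b)        # same order is fully deterministic
print(a, reordered)  # 0.0 vs 0.5: order changes the rounded result
```

So as long as the compressor and decompressor run the network with the same operation order, both sides compute identical probabilities, which is exactly what lossless compression needs.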

[–] person594@feddit.de 0 points 1 year ago (1 children)

Let's just outlaw racism too while we're at it!

[–] person594@feddit.de 3 points 1 year ago