How to prevent Google from indexing the HTTP version of a site?
I don't see an easy way to do that. Since Notion controls the robots.txt file, Notion would need to add a feature that lets you allow or block the Google bot.
If you use Super.so to host your workspace, they are looking to add such a feature.
Getting uber-fancy, you could front your Notion pages with a CDN and have the CDN intercept requests for robots.txt, serving its own copy instead of Notion's. Like so: https://docs.fastly.com/en/guides/creating-and-customizing-a-robots-file
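As a rough sketch of that CDN approach (following the pattern in the Fastly guide linked above, so treat the exact VCL as an assumption, not tested config): intercept `/robots.txt` in `vcl_recv`, raise a custom status, and return a synthetic response in `vcl_error` so the request never reaches Notion's origin.

```vcl
sub vcl_recv {
  # Intercept robots.txt before it is forwarded to the Notion origin.
  if (req.url.path == "/robots.txt") {
    error 601 "robots";  # 601 is an arbitrary internal status used as a signal
  }
}

sub vcl_error {
  if (obj.status == 601) {
    set obj.status = 200;
    set obj.response = "OK";
    set obj.http.Content-Type = "text/plain";
    # Serve your own robots.txt body; adjust the rules to taste.
    synthetic {"User-agent: *
Disallow: /"};
    return(deliver);
  }
}
```

The same idea works on other edge platforms (e.g. a Cloudflare Worker returning a plain-text response for that path); the key point is that the edge answers robots.txt itself rather than proxying it to Notion.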