Search engines either don't index my website or have a hard time finding it, simply because I don't follow the predatory rules that many of them (not all) enforce.
Time to exploit these systems and force them to find my stuff. In the past I hosted my website on GitHub Pages. The hosting itself kinda sucked, but my pages got found all the time and were indexed really well. My assumption is that GitHub has some internal system that gets pages found by / submitted to search engines.
You can still find my old website (archived) with all its broken links to this day.
So... I came up with a system. I will rehost my website on GitHub Pages, BUT I'll add JS code to the page that checks whether the visitor is a search-engine crawler or a human, and redirects humans to the real page. I don't normally like this kind of trick, but it seems to be the only way. And yes, I could just double host, but f*ck that.
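A minimal sketch of what that JS code could look like. Everything here is an assumption on my part: `NEW_SITE` is a placeholder domain (not my real one), and the crawler list is just a few well-known user-agent substrings, nowhere near exhaustive.

```javascript
// Placeholder for the real site's address -- swap in the actual domain.
const NEW_SITE = "https://example.com";

// A few common crawler user-agent substrings (assumed, not exhaustive).
const BOT_PATTERNS = ["googlebot", "bingbot", "duckduckbot", "baiduspider", "yandexbot"];

// Guess whether a user-agent string belongs to a search-engine crawler.
function isBot(userAgent) {
  const ua = userAgent.toLowerCase();
  return BOT_PATTERNS.some((p) => ua.includes(p));
}

// Crawlers stay on the GitHub Pages mirror (so it keeps getting indexed);
// humans get the same path on the real site. Returns null for "no redirect".
function redirectTarget(userAgent, path) {
  return isBot(userAgent) ? null : NEW_SITE + path;
}

// In the browser this would run on page load, something like:
// const target = redirectTarget(navigator.userAgent, window.location.pathname);
// if (target) window.location.replace(target);
```

Keeping the path in the redirect is what makes the old deep links keep working instead of dumping everyone on the new homepage.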
This should also mean that all the old links to my site scattered around the web will lead to the new site... or rather redirect to it, lol.