AI has a "Dark Matter" problem.
— How To AI (@HowToAI_) May 14, 2026
And it’s the reason why even the smartest models still hallucinate.
Most scientific knowledge is stored in a "compressed" form. We see the final conclusion, the textbook formula, the Wikipedia claim, the polished result.
But the actual… pic.twitter.com/vircB73bI4
From the middle of the tweet:
They built a search engine that doesn't look for keywords. It performs "Inverse Knowledge Search."
If you query a concept, it doesn't give you a summary. It retrieves the diverse, verified reasoning paths from physics, chemistry, and biology that all culminate in that single point.
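
The tweet doesn't describe an implementation, but the idea can be sketched. Assuming a corpus where each entry keeps its full derivation alongside its final claim, an "inverse" retriever would score entries by how well their conclusions (not their text as a whole) match the query, then keep the best chain per domain so the results stay diverse. Everything below, the `ReasoningChain` shape and the word-overlap scorer included, is an illustrative stand-in, not the paper's method:

```python
from collections import defaultdict
from dataclasses import dataclass

@dataclass
class ReasoningChain:
    domain: str          # e.g. "physics", "chemistry", "biology" (assumed schema)
    steps: list[str]     # the intermediate derivation, not just the answer
    conclusion: str      # the final claim the chain arrives at

def _tokens(text: str) -> set[str]:
    return set(text.lower().split())

def _overlap(a: str, b: str) -> float:
    """Jaccard similarity on word sets: a toy stand-in for a real embedding model."""
    ta, tb = _tokens(a), _tokens(b)
    return len(ta & tb) / len(ta | tb) if ta | tb else 0.0

def inverse_knowledge_search(query: str, corpus: list[ReasoningChain],
                             per_domain: int = 1) -> list[ReasoningChain]:
    """Return chains whose *conclusions* match the query, keeping the
    top few per domain so the result set spans several fields."""
    by_domain: dict[str, list[tuple[float, ReasoningChain]]] = defaultdict(list)
    for chain in corpus:
        score = _overlap(query, chain.conclusion)
        if score > 0:
            by_domain[chain.domain].append((score, chain))
    results = []
    for scored in by_domain.values():
        scored.sort(key=lambda sc: sc[0], reverse=True)
        results.extend(chain for _, chain in scored[:per_domain])
    return results

# Hypothetical mini-corpus: three derivations converging on one concept.
corpus = [
    ReasoningChain("physics", ["heat is molecular motion", "motion degrades order"],
                   "entropy increases in closed systems"),
    ReasoningChain("chemistry", ["reactions favor states with more microstates"],
                   "entropy increases in spontaneous reactions"),
    ReasoningChain("biology", ["cells spend energy to stay ordered"],
                   "life locally resists entropy at a global cost"),
]
for chain in inverse_knowledge_search("entropy increases", corpus):
    print(chain.domain, "->", " / ".join(chain.steps))
```

A real system would swap the Jaccard scorer for a learned embedding and verify each chain's steps before indexing it; the structural point is only that the index key is the conclusion while the payload is the reasoning that produced it.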
The final lines in the tweet:
We’ve spent years training AI to mimic how humans talk about science.
But talking about science is just repeating conclusions.
This paper proves that the future of intelligence is about reconstructing the logic that built it in the first place.
