Basically, I’ve been using the different probability generation spaces that different AI models represent as a way to figure out convergence/divergence for any ideas I might have. I’m sort of abstracting the notion of Moiré interference to see if any patterns pop out semantically, and building up a stochastic picture of whatever I’m considering. There’s a weird back and forth where I’ll range over ideas for a while before landing on something solid. It weirdly feels like looking through a microscope or telescope.
submitted by /u/Novel_Nothing4957