Danger. Danger. Will Robinson. This is from Google's Q3 conference call last night:
"early testing results from AI Overviews suggest Gen-AI Search features are leading to higher user satisfaction and increased usage, with the highest engagement from younger users aged 18 to 24 when they use search with AI overviews."
So here's the deal, and don't just go back to bed and pull the pillow over your face: people aged 18-24 believe almost everything they read on the web, and they believe it MUCH MORE SO if it comes from an AI model ... which isn't so smart to begin with, and has also been "trained" in certain subjective ways. Fun. Fun. Fun.
Oh, and by the way: "Google has reduced the cost of AI queries by 90%+ in the last 18 months," so expect all AI, all the time.
I mean, ideally people would learn in school not to believe everything they read; I was taught that pre-Internet. That said, there also seem to be a lot of anecdotes about people just over-trusting "computers". Probably because they learned that computers are almost infallible at the basic math stuff humans find hard, and then they extrapolate that incorrectly to "all problems". It's also the case that for years a Google search would, something like 90% of the time, get you to a page with "the right answer" for a lot of general-knowledge questions, and Reddit and YouTube raised that quite a bit for otherwise hard-to-find forums or for tasks that are hard to show in text.
So now they're extrapolating from one generally OK but sometimes-wrong thing to a completely different technology that is at best about 50/50 on being right. What I actually want from AI search is a deeper scan of the results to derank the SEO garbage, plus natural-language handling that's much better than the traditional "weird search language". But the AI summaries often lead you just far enough astray that I end up going on to search again anyway.
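To make that wish a bit more concrete, here's a rough Python sketch of what I mean by "AI as a reranker" instead of "AI as an answer box". The spam_likelihood and relevance scorers are hypothetical placeholders, not anything Google actually exposes; the point is only the shape of the pipeline.

    # Sketch of "AI as a reranker" instead of "AI as an answer box".
    # The two scoring functions are hypothetical stand-ins for whatever
    # model you would actually use; only the pipeline shape matters here.
    from dataclasses import dataclass

    @dataclass
    class Result:
        url: str
        title: str
        snippet: str

    def spam_likelihood(r: Result) -> float:
        # Placeholder: imagine a classifier trained to spot SEO boilerplate,
        # affiliate farms, scraped/duplicated content, etc.
        return 0.9 if "10 best" in r.title.lower() else 0.1

    def relevance(query: str, r: Result) -> float:
        # Placeholder: imagine an embedding or cross-encoder similarity score
        # between the natural-language query and the page content.
        words = query.lower().split()
        return sum(w in r.snippet.lower() for w in words) / max(len(words), 1)

    def rerank(query: str, results: list[Result]) -> list[Result]:
        # Demote likely SEO garbage, promote pages that actually answer the
        # query, and hand the user real links instead of a paraphrased summary.
        return sorted(results,
                      key=lambda r: relevance(query, r) - spam_likelihood(r),
                      reverse=True)

    if __name__ == "__main__":
        hits = [
            Result("https://example.com/10-best-widgets",
                   "10 Best Widgets 2024", "buy widgets now cheap deals"),
            Result("https://example.org/widget-manual",
                   "Widget service manual",
                   "how to replace the widget bearing step by step"),
        ]
        for r in rerank("how to replace widget bearing", hits):
            print(r.url)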
People should take AI with a huge grain of salt, but of course it's hyped to hell and gone. It's only slowly, incrementally getting better at this point, and unless correctness jumps way up via some new technique, I see it basically stalling out at converting natural language into a rough draft of a search, or of code, or the like.
Cassandra complex away - people have been stumbling along forever with no real idea how any of the tools they use work. It's why we have stories like "The Time Machine" or various OG Star Trek episodes, where people no longer know how the "machine god" works, and when it starts to malfunction they have no idea what to do. It means there are going to be people who bother to learn decent rules of thumb and have a better life for it, those who become experts and can repair the things, and those who push the tech forward. But the masses will struggle to find information because they still don't really know how.