A risky move

It's a risky strategy for Google. It risks eroding the public's trust in Google as the place to find (correct) answers to questions. It also risks undermining the company's own billion-dollar business model: if we no longer click through to links and instead just read the AI summary, how does Google continue to make money?

The risks are not restricted to Google. I fear such use of AI might be harmful to society more broadly. Truth is already a somewhat contested and fungible idea, and AI untruths are likely to make this worse.

In a decade's time, we may look back at 2024 as the golden age of the web, when most of it was quality human-generated content, before the bots took over and filled the web with synthetic and increasingly low-quality AI-generated content.

Has AI started breathing its own exhaust?

The second generation of large language models is likely, and unintentionally, being trained on some of the outputs of the first generation. And plenty of AI startups are touting the benefits of training on synthetic, AI-generated data.

But training on the exhaust fumes of current AI models risks amplifying even small biases and errors. Just as breathing in exhaust fumes is bad for humans, it is bad for AI.

These concerns fit into a much bigger picture. Globally, more than US$400 million (A$600 million) is being invested in AI every day. Given this torrent of investment, governments are only now waking up to the idea that we might need guardrails and regulation to ensure AI is used responsibly.

Pharmaceutical companies aren't allowed to release drugs that are harmful. Nor are car companies. But so far, tech companies have largely been allowed to do what they like.

Some AI Overview results appear to have mistaken jokes and parodies for factual information. Image by Google / The Conversation