Read Before Monday #29

Another week and we're already closer to autumn... In this edition we review OpenAI's o1-preview and o1-mini models, emphasising their enhanced reasoning but noting issues with cost and transparency due to hidden "reasoning tokens". Then we turn to the inadequate testing of medical AI, calling for real-world trials and transparency. 1984 called, and it sounds a lot like Larry Ellison's AI-powered surveillance vision, which sparks concerns about privacy and overreach. Also, the FTC criticises social media platforms' data practices, pushing for regulation - I'm shocked (narrator: he's not!). Lastly, did you know that tech companies like Spotify exploit nostalgia, driving price hikes while consumers remain attached due to curated memories? See you next week :)

___

Simon Willison's article discusses OpenAI's new models, "o1-preview" and "o1-mini," which focus on improving reasoning capabilities via a "chain of thought" approach. These models excel in tasks that require complex reasoning but introduce trade-offs in terms of cost and response times. They also feature "reasoning tokens" that are hidden but billed, leading to some transparency concerns. The models are best suited for intricate tasks like crossword solving, but their full potential is still being explored.

  • My take: A lot has been written this week about the new OpenAI model with reasoning and chain of thought, and I didn't want to go with the news around it but rather with someone's opinion. OpenAI has been doing what startups and companies should be doing: improving and iterating on their MVP, adding new features with every cycle and consolidating previous ones. This new model has some interesting quirks, like the reasoning tokens - aka you pay more - and the trade-off includes longer processing times. It's limited and it's not 4o, but it's another feature that people can explore and get familiar with. For me the bigger issue is that the reasoning tokens are hidden, which ultimately reduces transparency. In the end we're all still trying to figure out use cases for all these new capabilities - and that's the fun part of it! Here's some more detail about it.

___

The Nature article discusses how medical AI algorithms are being tested inadequately, with many receiving regulatory approval based on limited clinical data. This raises concerns about the real-world effectiveness and safety of these technologies. Experts propose that comprehensive, transparent clinical trials and real-world testing are essential to ensure these tools work reliably in diverse medical settings.

  • My take: Still on the topic of AI, this article from Nature is quite interesting and informative about the usage of AI in medicine, particularly how the algorithms haven't been tested properly. Again, transparency is key when it comes to GenAI/AI, and if we don't have it, the trust relationship is broken, which leads to regulatory hurdles and safety problems in the near future. So yes, real-world testing is essential for AI to succeed in the medical field, but we also need to understand the complexity of the ethical impact and properly inform patients.

___

Oracle CEO Larry Ellison details his vision for an AI-powered surveillance system, where always-on cameras in police cars, schools, and drones stream data to Oracle for AI analysis. Ellison claims this will improve public safety by constantly monitoring and reporting activities, but the piece highlights privacy, bias, and legal concerns, emphasising that such systems already exist but have failed to reduce crime or address deeper societal issues.

  • My take: Oh boy, oh boy! Larry Ellison envisions a widespread AI-driven surveillance system using always-on cameras to monitor public spaces, where Oracle's technology will include body cameras, drones, and school security systems, with data analysed by AI. 1984 was supposed to be a warning, not an instruction manual! - and that's quite right. We know surveillance tech promises safety, but its record on reducing crime is inconsistent, and an always-watching eye creates privacy concerns despite claims of enhanced accountability. The effect of body cameras on police behaviour remains unproven for many, and the "always watching" narrative conflicts with issues like misidentification and bias in current AI systems. It's a mess, and having tech moguls plan to restrict privacy without broader oversight is scary. We clearly need to read 1984 more often to ground us in the potential danger.

___

The Verge highlights a damning report by the FTC on social media platforms’ data collection practices, revealing widespread surveillance and indefinite retention of user and non-user data. Companies like Facebook, YouTube, and others profit heavily from this data but often fail to protect users' privacy. The FTC argues that self-regulation has been ineffective and calls for comprehensive privacy legislation. The report also finds that companies collect data from external sources and sometimes do not fully comply with data deletion requests, even for minors.

  • My take: It's pretty grim, but we already knew it. Self-regulation doesn't work. Never has. When you have a strategy where data collection is incentivised by highly profitable business models, there's no pressure to have it regulated. Then data often comes from third-party sources like advertisers or data brokers, all of which raises privacy concerns given recent failures. Add data deletion requests that aren't fully honoured and you have a fire that's very hard to put out. Yes, many platforms fail to adequately protect children's and teens' privacy, and that should be quite scary for everyone - not just parents.

___

Tech companies like Spotify capitalise on user nostalgia, allowing them to raise prices significantly. Spotify, despite recent price hikes far above inflation, retains customers due to their emotional attachment to curated playlists. The platform, now profitable, benefits from low churn rates, but users remain vulnerable to price increases since they don't own their digital content. This highlights the shift from physical ownership (like records) to dependency on tech companies for access to personal memories.

  • My take: I usually bring you a nostalgia post every week about a product, a piece of software, or something we used to do. But here's a breakdown of how companies are monetising that revivalism. It's not only Spotify, but also Netflix with some of their series. I've recently changed to Apple Music and stopped using Spotify altogether. I don't like constant price hikes without more value being provided. I get that anything related to nostalgia drives retention, but there's so much digital abundance now that we need to expand our horizons constantly. And yes, I've been drawn to vinyl... And my first "new" album was The Prodigy's Music for the Jilted Generation.

___

This Week in GenAI

Our live this week focused on the new GenAI assistant from Amazon called Amelia, the promise of GenAI for fintech, and finally Veo for YouTube Shorts.

In other news: