Please Don’t Throw Your Smartphone, Yet

Apple has quietly integrated multiple generative AI features into its products and services. For example, the latest Series 9 Watch, iPhone 15 and other devices now ship with a Siri built on a transformer language model. They also boast an offline Siri, a first for Apple, alongside support for running diffusion models on Apple Silicon devices, among other enhancements.

But, don’t throw away your old phones yet.

Jim Yang, senior AI scientist at NVIDIA, stated that LLMs could change how smartphone keyboards function, as LLMs are good at predicting the next word in a sentence. Interestingly, three months after Yang said this, Apple revealed something similar at its Worldwide Developers Conference 2023. The phone maker said autocorrect will use an on-device transformer language model to improve its predictions.
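For a sense of what "predicting the next word" looks like in practice, here is a minimal sketch using the open-source Hugging Face transformers library with GPT-2 as an illustrative stand-in; Apple's on-device autocorrect model is not public, so this is not its actual implementation.

```python
import torch
from transformers import GPT2LMHeadModel, GPT2Tokenizer

# GPT-2 is only a stand-in for "a transformer language model" here.
tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")
model.eval()

text = "I'll meet you at the"
inputs = tokenizer(text, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits  # shape: (1, sequence_length, vocab_size)

# The distribution at the last position gives the model's top guesses
# for the next word -- exactly what a keyboard suggestion bar needs.
next_token_logits = logits[0, -1]
top = torch.topk(next_token_logits, k=5)
print([tokenizer.decode([i]) for i in top.indices.tolist()])
```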

Apple has spent nearly $22.6 billion on R&D, driven by investments in AI technologies, particularly generative AI. The tech giant is also burning millions of dollars a day to develop new Siri features that let customers automate simple tasks and more using voice commands alone.

Besides Apple, Qualcomm also sees future growth driven by on-device machine learning workloads. Qualcomm chief Cristiano Amon believes that generative AI could breathe new life into smartphones. The chip giant powers most Android smartphones, including those from Samsung, OnePlus, Oppo, Vivo, Realme, Redmi, Xiaomi, Lenovo and Motorola, among others.

Many Chinese smartphone makers, including Huawei and Xiaomi, are already working on running LLMs directly on smartphones and bringing generative AI applications to end-user devices.

See how generative AI is changing the smartphone landscape here.


No iPhone 15 Satellite Feature in India

Apple’s newly launched iPhone 15 adds a new feature to its Emergency SOS satellite service – roadside assistance. The feature is currently available in 14 countries and regions: Australia, Austria, Belgium, Canada, France, Germany, Ireland, Italy, Luxembourg, the Netherlands, New Zealand, Portugal, the UK, and the US, and will reach Spain and Switzerland later this month.

But Indian users are not going to get it any time soon. Here’s why.


LLMs are Too Hot to be Cool?

The term temperature is gaining a lot of traction among prompt engineers and LLM developers, who are using it to make LLM-based chatbots like ChatGPT, Bard and others more creative, and sometimes even crazy.

No, we are not talking about physical temperature. Temperature is a concept borrowed from statistical physics and is integrated into the functioning of LLMs like GPT-4 or PaLM-2. This parameter allows users to adjust the balance between creativity and coherence when generating text. For instance, higher temperatures introduce more randomness, resulting in creative, yet potentially less coherent output. Lower temperatures, on the other hand, yield more deterministic and focused responses, emphasising coherence over creativity.
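For intuition, here is a minimal sketch (plain Python/NumPy, not tied to any particular model's API) of how temperature rescales a model's output distribution before a token is sampled:

```python
import numpy as np

def sample_next_token(logits: np.ndarray, temperature: float = 1.0) -> int:
    """Sample a token index from raw model logits after temperature scaling."""
    # Dividing logits by the temperature before the softmax is the standard
    # trick: T < 1 sharpens the distribution (more deterministic output),
    # T > 1 flattens it (more random, "creative" output).
    scaled = logits / temperature
    probs = np.exp(scaled - scaled.max())  # softmax with max-subtraction for stability
    probs /= probs.sum()
    return int(np.random.choice(len(probs), p=probs))

# Toy logits over a five-token vocabulary.
logits = np.array([2.0, 1.0, 0.5, 0.2, -1.0])
print(sample_next_token(logits, temperature=0.2))  # almost always picks token 0
print(sample_next_token(logits, temperature=1.5))  # noticeably more varied
```

Hosted chatbot APIs typically expose this as a single temperature parameter: values near zero behave almost greedily, while higher values trade coherence for variety.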

Read more here.


Bhashini's 'AI'volution

India, home to some 19,500 languages and dialects, has only about 10% English speakers. Hindi, spoken by 43.63% of Indians, is dominant, yet a larger share of the population has other languages as their mother tongue. In this complex scenario, prepping the country to lead the AI revolution is easier said than done.

This is where Bhashini comes into the picture. This first-of-its-kind initiative by the Indian government leverages AI to break language barriers, enabling underserved communities to access government services seamlessly.

Learn more about how Bhashini is truly democratising AI here.
