[14th April 2024] Interesting Things I Learnt This Week
1. Opera adds built-in support for local LLMs - Opera is adding local AI models to its browser. Users can choose from 150 local LLM variants, keeping their data private on their device. This is part of Opera’s AI Feature Drops Program for early adopters. My Take : Locally running LLMs will become commonplace very soon. It requires not just software but hardware support as well; on older hardware it might take ages to get anything done, but on newer hardware it should work well. The best software to integrate them into is going to be the browser for most folks, IMHO. I was hoping Firefox would be the first one to do it, and I still wish they would work towards it. I have no hope of any of the big personal-computing software vendors doing it: they will push for only their own models running and working across devices, and trusting them not to harvest data off it would be a challenge. Nevertheless, integrating LLM APIs into the browser will open up some interesting avenues for web applications.
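To make the last point concrete, here is a minimal sketch of what a browser-exposed local-LLM API could look like from a web page's perspective. Every name here is an assumption for illustration (no browser ships this exact interface); the stub object stands in for the browser-provided model so the sketch is self-contained and runnable anywhere.

```javascript
// Hypothetical sketch only: `browserLLM` stands in for an API a browser
// might expose (e.g. something hanging off `navigator`). Not a real API.
const browserLLM = {
  async createSession({ model }) {
    return {
      async prompt(text) {
        // A real implementation would run the model on-device and
        // return generated text; the stub just echoes its input so
        // the example works without any model installed.
        return `[${model}] ${text}`;
      },
    };
  },
};

// A web app could then summarize the current page without any data
// leaving the device, which is the privacy win local LLMs promise.
async function summarizePage(pageText) {
  const session = await browserLLM.createSession({ model: "local-small" });
  return session.prompt(`Summarize: ${pageText}`);
}

summarizePage("Opera ships 150 local LLM variants.").then(console.log);
```

The interesting part is the second-order effect: once a session object like this exists, features such as on-device summarization, translation, or form autofill become a few lines of front-end code instead of a round trip to a vendor's cloud.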