This article covers learnings, insights and pointers that I heard at the 20th edition of the "Open Source India" conference in Bengaluru this week, from industry experts, practitioners and enthusiasts.
I will try my best to provide exact details, terminology, links and references so that you find it useful to understand, adapt and apply.
Open Source Software
- Open Source Software has contributed to the Mars mission, UPI, agriculture innovations, the Telehealth ICU that was widely used during Covid, and tons of gaming projects. What a time to be a software developer: grab the opportunity and see if your code can touch the world.
- Contribute to Open Source whenever possible. It is especially apt for young programmers who want technical challenges they might not get in the early days of a career at tech companies.
- The speaker from GitHub presented some "awesome insights": GitHub sees about 13 commits per second across the globe, India has some X million developers on GitHub, top companies (Ex: Microsoft, who would have thought this would happen 10 years back!) are actively contributing to Open Source across technologies, and many more. Check this link - https://innovationgraph.github.com/
- Open Source Software is truly becoming more and more secure by being transparent. The community is making "code, documents, algorithms, engines" available for everyone to see, understand and leverage. There are very few black boxes in the Open Source world.
- Working as an "Open Source Contributor" keeps you ahead of the curve, as you see how the industry is moving, what it is envisioning and, most importantly, what it is practicing.
GenAI - Part 1
- With GenAI tools and LLM models freely available, or available with very little investment, there is "no barrier for any company" to try out a PoC for their core problems or for areas where there is a HUGE opportunity.
- Your PoC team will for sure be able to show something working in quick time. In parallel, ensure you have a TEAM covering quality, regulatory, validation, security and privacy working on HOW to take the solution to production.
- Do remember: until you take something to production, there is very little change to your business model, user experience or competitive edge.
- Upskilling is super important in the GenAI space, and the preferred way is to do it with your current team members. Tell your Training & Development team that you are doing it for your CUSTOMERS.
- AWS has an amazing tool called "AI Use Case Explorer" to explore use cases in your area of interest. Check this link - https://aiexplorer.aws.amazon.com/
LLM customization can be done using the options below:
- Prompt engineering: There is enough buzz around this role, but just think of the power to save tokens, money and time if you can ask the right question the first time and get your answer.
- Adjusting parameters: Temperature (hint: it's not GPU temperature but a setting that controls creativity/randomness) and the number of tokens are very easy to relate to (see the minimal API sketch after this list). Check this link to learn more with examples - https://learnprompting.org/docs/basics/configuration_hyperparameters
- RAG, Retrieval Augmented Generation: RAG is claimed to be THE approach to bring your own data and use it with LLMs without spending time & money on fine tuning (see the RAG sketch after this list). Check these links to understand the concept and how to apply it in your solution 1) https://vitalflux.com/retrieval-augmented-generation-rag-llm-examples/ 2) https://www.youtube.com/watch?v=J_tCD_J6w3s
- Fine tuning: This is expensive; invest in it if you have domain-specific data, a use case and the money to create a USP for your business.
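To make the prompt engineering and parameter points above concrete, here is a minimal sketch using the OpenAI Python SDK (one widely used option, not something specific to the talks). The model name, prompt and values are illustrative assumptions; temperature controls randomness/creativity and max_tokens caps the length, and hence the cost, of the answer.

```python
# pip install openai
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

# A specific, well-structured prompt (role, context, task, output format),
# so you get a usable answer on the first try instead of burning tokens on retries.
prompt = (
    "You are a release-notes assistant for a payments product.\n"
    "Summarise the following changelog for non-technical customers "
    "in exactly 3 bullet points:\n\n"
    "- Added UPI autopay mandates\n"
    "- Fixed settlement delay on weekends\n"
    "- New dashboard for refunds"
)

response = client.chat.completions.create(
    model="gpt-4o-mini",          # illustrative model name
    messages=[{"role": "user", "content": prompt}],
    temperature=0.2,              # low = focused and repeatable, high = more creative
    max_tokens=150,               # caps response length, and therefore cost
)
print(response.choices[0].message.content)
```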
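And here is a deliberately tiny, library-free sketch of the RAG idea: retrieve the most relevant snippets from your own data, then augment the prompt with them before calling the LLM. The word-overlap "retriever" below is a toy stand-in for the embedding model and vector store a real system would use; the documents and question are made up for illustration.

```python
# Toy retrieve-then-augment example; a real system would use embeddings + a vector DB.
docs = [
    "Refunds are processed within 5 business days via the original payment method.",
    "UPI autopay mandates can be paused from the subscriptions page.",
    "Settlement reports are generated daily at 6 AM IST.",
]

def retrieve(question: str, documents: list[str], k: int = 2) -> list[str]:
    """Rank documents by naive word overlap with the question (toy retriever)."""
    q_words = set(question.lower().split())
    scored = sorted(documents,
                    key=lambda d: len(q_words & set(d.lower().split())),
                    reverse=True)
    return scored[:k]

question = "How long do refunds take?"
context = "\n".join(retrieve(question, docs))

# The augmented prompt grounds the LLM in YOUR data instead of its training data.
augmented_prompt = (
    f"Answer using only the context below.\n\n"
    f"Context:\n{context}\n\nQuestion: {question}"
)
print(augmented_prompt)  # send this to the LLM of your choice
```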
GenAI - Part 2
- In this whole GenAI flow, we still need human intelligence to define what you want, and "LLM agents" help provide that. They are a combination of an LLM plus a toolkit (an SDK, as we love to call it). They help with planning, calling your code, applying logic and reacting through a flow that you control (see the sketch after this list). LangChain is super popular and used by almost everyone; Semantic Kernel seems equally good, with support for .NET and Java as well.
- The presenter from Microsoft mentioned that they use Semantic Kernel behind Bing search and the Copilots in Office 365 applications. Check this link - https://learn.microsoft.com/en-us/semantic-kernel/overview/
- When you are trying out GenAI, think of yourself as "a toddler ready to go to the swimming pool". Enjoy the experience and the excitement of trying out anything, but do remember you have to go into the water with protective gear. Aligning with your company policies and guidelines is your protective gear, and it is a MUST.
- GPUs, AI chips and hardware innovations are playing a huge role in this space. You would be amazed to see the "ability of GPUs/AI chips to do parallel processing" at unimaginable scale. Kudos to all the innovators in this space; after all, software runs on hardware, and the FOUNDATION is super important. Check this link if you want to learn more - https://www.run.ai/guides/gpu-deep-learning
- Four things are super important when you design your GenAI pipeline. Those are Architecture, Data Privacy & Security, Feedback loop and Accuracy.
- Have "realistic expectations on Accuracy" and ensure you add HUMAN in the loop for critical solutions. That ways you will get a BUY-IN from your customers and quality & regulatory team.
- General-purpose LLM models will suffice for some use cases very easily. All you need to do is build a PoC, ask your end users to try it out and iterate on improvements. After that, you can decide what you need to plan, design and deliver.
- Who doesn't want to deliver good UX? In the GenAI world, it's a combination of MODEL metrics and APPLICATION metrics.
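To illustrate the "LLM agent" idea from the first bullet above (plan, call your code or tools, apply logic, react), here is a stripped-down, framework-free sketch of the loop that toolkits such as LangChain and Semantic Kernel package up for you. The plan_next_step function is a stand-in for a real LLM call, and the tool and order ID are made up for illustration.

```python
# A bare-bones agent loop: the "LLM" decides which tool to call, your code executes it,
# and the observation is fed back until a final answer is produced.
# Frameworks such as LangChain and Semantic Kernel implement this pattern for you.

def get_order_status(order_id: str) -> str:
    """Example tool: in real life this would hit your own API or database."""
    return f"Order {order_id} shipped yesterday."

TOOLS = {"get_order_status": get_order_status}

def plan_next_step(goal: str, history: list[str]) -> dict:
    """Stand-in for an LLM call that returns the next action as structured output."""
    if not history:
        return {"action": "get_order_status", "input": "A-123"}
    return {"action": "final_answer", "input": f"Update for the user: {history[-1]}"}

def run_agent(goal: str) -> str:
    history: list[str] = []
    for _ in range(5):  # cap how many steps the agent is allowed to take
        step = plan_next_step(goal, history)
        if step["action"] == "final_answer":
            return step["input"]
        observation = TOOLS[step["action"]](step["input"])
        history.append(observation)
    return "Stopped: step limit reached."

print(run_agent("Where is order A-123?"))
```

The key design point is the flow YOU control: your code owns the tool registry, the step limit and the stopping condition, while the LLM only proposes the next action.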
GenAI summary
- This opinionated summary from the Thoughtworks team will provoke your thoughts. Sit with your team and discuss it.
Emergent abilities
There were several references to the emergent abilities of LLM models throughout the event. Check this link if you are curious about what it means for the future as we imagine it today - https://www.scientificamerican.com/podcast/episode/should-we-care-about-ais-emergent-abilities/
Lighter moments
Presenting is also about keeping the auditorium lively. Some examples :)
- The presenter from AWS, while facing issues connecting his laptop to the projector, said: "We are yet to find a solution for projector issues even though we are doing GEN AI" :)
- There is a new 'verse' now, as per a presenter from Unisys. It's called the "LLM verse" :)
- As per one of the presenters from Thoughtworks, fine tuning of LLMs is not really FINE as it involves millions :)
- One of the presenters, who delivered a session just before lunch time, made the first slide "What is on the lunch MENU" rather than a boring AGENDA slide :)
Hope you found this article useful in providing clarity and curiosity. Thank you for reading.