Discover 130K+ AI Models on FriendliAI's New Model List Page! We’re excited to launch our new Model List Page, your go-to hub for exploring and deploying over 130,000 AI models. From language models to image generation and multimodal tasks, find the perfect model for your needs and deploy it with just one click!
What’s New?
- Diverse AI models for text, audio, video, and more
- Seamless integration from Hugging Face
- Optimized models for faster, cost-effective performance
- Easy access to Friendli Dedicated and Serverless Endpoints
Start exploring today and take your AI projects to the next level!
Read more: https://lnkd.in/gDqpRgvg
#GenerativeAI #AIModels #MultimodalAI #ModelDeployment #FriendliAI
FriendliAI
Software Development
Redwood City, California · 2,215 followers
Accelerate Generative AI Inference—fast, efficient, and reliable generative AI inference solution for production
About us
Supercharge Generative AI Inference: efficient, fast, and reliable generative AI inference solution for production
- Website: https://friendli.ai/
- Industry: Software Development
- Company size: 11-50 employees
- Headquarters: Redwood City, California
- Type: Privately Held
- Founded: 2021
- Specialties: Machine Learning, Deep Learning, Software Platform, Artificial Intelligence Platform, Generative AI, Cloud Service, SaaS, Inference Serving, Fine-tuning, LLM Serving, Inference, and AI Agent
Locations
Employees at FriendliAI
Updates
Effortlessly Deploy Multi-Modal Models to FriendliAI Directly from Hugging Face! We're excited to announce that Friendli Endpoints, previously available as a deployment option for language models on Hugging Face, now supports multi-modal models as well! Enjoy lightning-fast inference on Friendli Endpoints with just a single click on Hugging Face. Deploy faster and scale smarter with FriendliAI!
Learn more: https://bit.ly/3XSTb71
#AI #MultimodalAI #AIInference #HuggingFace #FriendliAI
The countdown to #NVIDIA #GTC2025 is on, and FriendliAI is ready to showcase the next era of generative AI inference! Join us as we connect with AI pioneers, dive into cutting-edge innovations, and demonstrate how FriendliAI optimizes AI model performance with industry-leading speed, cost efficiency, and operational simplicity.
Meet us at Booth #3212 | San Jose Convention Center
Book a meeting with our team: https://lnkd.in/d4-McKtp
Explore our solutions: https://friendli.ai
Let’s unlock new possibilities in generative AI inference—see you at GTC!
#FriendliAI #GTC25 #AI #GenerativeAI #Inference
Accelerate Your AI Voice Agents with FriendliAI! What makes a great AI voice agent? Speed, consistency, and the ability to scale effortlessly. FriendliAI delivers on all fronts with three key advantages:
- Lowest TTFT (Time to First Token) for near-instant responses
- Best TPOT (Time per Output Token) for natural conversations
- Consistent performance to keep your operations running smoothly
Explore our blog post: https://bit.ly/3DnHG0H
Find out more: https://friendli.ai
#FriendliAI #AIVoiceAgent #AIAssistant #AgenticAI #Inference
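To make the two latency metrics above concrete, here is a minimal sketch of measuring TTFT and TPOT for a streaming chat completion through an OpenAI-compatible client. The base URL, model id, and token environment variable are illustrative assumptions, not confirmed values; check the FriendliAI documentation for the exact ones.

```python
# Sketch: measure TTFT and TPOT for a streaming chat completion.
# base_url, model id, and the FRIENDLI_TOKEN env var are assumptions for illustration.
import os
import time

from openai import OpenAI

client = OpenAI(
    base_url="https://api.friendli.ai/serverless/v1",  # assumed OpenAI-compatible endpoint
    api_key=os.environ["FRIENDLI_TOKEN"],              # assumed token variable
)

start = time.perf_counter()
first_token_at = None
chunk_count = 0  # each streamed chunk typically carries one token, so this approximates token count

stream = client.chat.completions.create(
    model="meta-llama-3.3-70b-instruct",  # illustrative model id
    messages=[{"role": "user", "content": "Hi! What can you help me with today?"}],
    stream=True,
)

for chunk in stream:
    if chunk.choices and chunk.choices[0].delta.content:
        if first_token_at is None:
            first_token_at = time.perf_counter()
        chunk_count += 1

end = time.perf_counter()
ttft = first_token_at - start                              # Time to First Token
tpot = (end - first_token_at) / max(chunk_count - 1, 1)    # average Time per Output Token
print(f"TTFT: {ttft:.3f}s, TPOT: {tpot * 1000:.1f}ms")
```

Low TTFT is what makes a voice agent feel responsive at the start of a turn, while low and steady TPOT keeps the rest of the reply flowing at a natural speaking pace.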
New Blog Alert – The Complete Guide to the Friendli Container AWS EKS Add-On! Unlock new levels of performance by subscribing to Friendli Container on Amazon EKS! This seamless integration supercharges your Generative AI inference, boosting both efficiency and scalability in EKS with:
- Over 50% reduction in GPU usage
- Over 2x lower latency
- Over 2x higher throughput
Deploy Friendli Container in your EKS workflow today!
Check it out on AWS Marketplace: https://lnkd.in/divtpjzX
Read the blog: https://lnkd.in/gjBQMCpB
#AWS #EKS #Kubernetes #Containers #Inference #GenerativeAI
DeepSeek-R1 is now available on Friendli Serverless Endpoints! Here’s why teams are excited to integrate DeepSeek-R1 into their workflows:
- Blazing-fast speeds combined with high-quality responses
- Privacy-first hosting in a secure, US/Europe-based data center
- Support for maximum context lengths of up to 164k tokens
- Flexible, hassle-free pay-as-you-go pricing
Try DeepSeek-R1 now with a free credit: https://lnkd.in/dmEaWhhe
Check out our blog: https://lnkd.in/d5HgrEDE
#FriendliAI #DeepSeekR1 #ServerlessAPI #LLM #Inference
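As a rough sketch of what integration looks like, the snippet below calls DeepSeek-R1 on Friendli Serverless Endpoints through an OpenAI-compatible client. The base URL, model id, and environment variable are assumptions for illustration; confirm the exact values in the FriendliAI docs before use.

```python
# Sketch: a basic chat completion against DeepSeek-R1 on Friendli Serverless Endpoints.
# base_url, model id, and FRIENDLI_TOKEN are illustrative placeholders.
import os

from openai import OpenAI

client = OpenAI(
    base_url="https://api.friendli.ai/serverless/v1",  # assumed serverless endpoint URL
    api_key=os.environ["FRIENDLI_TOKEN"],              # assumed API token variable
)

response = client.chat.completions.create(
    model="deepseek-r1",  # illustrative model id; see the model list page for the actual one
    messages=[
        {"role": "user", "content": "Explain the difference between TTFT and TPOT."}
    ],
    max_tokens=512,
)

print(response.choices[0].message.content)
```

Because the endpoint speaks the familiar chat-completions format, swapping DeepSeek-R1 in for another hosted model should mostly come down to changing the model id.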
Meet FriendliAI at NVIDIA GTC 2025! Join us at booth #3212 to discover how FriendliAI accelerates your AI models with unmatched speed, cost efficiency, and operational simplicity. Book a private meeting to learn how Friendli Endpoints can transform your AI operations!
- Byung-Gon Chun (CEO)
- Chang-Yup Kim (VP of BD/Sales)
- Yunmo Koo (ML system engineer)
Book your meeting now: https://lnkd.in/d4-McKtp
Visit our website: https://friendli.ai
Read our solution brief: https://lnkd.in/dNpyupUr
#FriendliAI #NVIDIA #GTC #GenerativeAI
New YouTube video alert! Learn how to deploy Hugging Face models on Friendli Endpoints! Friendli Endpoints is linked directly from the Hugging Face model hub, allowing direct model deployment.
Watch the video: https://lnkd.in/dV4u82Wz
Try deploying Llama 3.3 70B now: https://lnkd.in/dx6bzabf
#FriendliAI #HuggingFace #OpenSource #Deployment
Deploy Hugging Face Models on Friendli Endpoints!
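For a sense of what comes after the one-click deployment from Hugging Face, here is a hypothetical sketch of querying the resulting dedicated endpoint over a chat-completions API. The URL, header, and endpoint id are placeholders; the real values come from the endpoint page after deployment and from the FriendliAI docs.

```python
# Sketch: query a model (e.g. Llama 3.3 70B) after deploying it to a Friendli Dedicated Endpoint.
# The URL and endpoint id below are assumptions for illustration only.
import os

import requests

ENDPOINT_ID = "YOUR_ENDPOINT_ID"  # hypothetical; shown on the endpoint's detail page

resp = requests.post(
    "https://api.friendli.ai/dedicated/v1/chat/completions",  # assumed dedicated endpoint URL
    headers={"Authorization": f"Bearer {os.environ['FRIENDLI_TOKEN']}"},
    json={
        "model": ENDPOINT_ID,
        "messages": [
            {"role": "user", "content": "Summarize Llama 3.3 70B in one sentence."}
        ],
        "max_tokens": 128,
    },
    timeout=60,
)
resp.raise_for_status()
print(resp.json()["choices"][0]["message"]["content"])
```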
FriendliAI spoke at Tokyo AI (TAI)’s meetup, alongside Liquid AI, Sakana AI, and Amazon Web Services (AWS). In the session, our CEO, Byung-Gon Chun, showcased GPU-optimized inference architectures powering record-breaking AI speeds—now accessible through Friendli Endpoints on Hugging Face. We’re excited to collaborate with leading companies in the Tokyo AI community to drive innovation together!
Ready to deploy Llama 3.3 70B? Try it now: https://lnkd.in/dBydwhpg
#TokyoAI #FriendliAI #LiquidAI #SakanaAI #AWS #LLM #AIInference
Join Zilliz’s live webinar this Thursday, February 6! In this session, Soomin Chun from FriendliAI will walk you through the complete developer workflow—from fine-tuning models to production deployment. Learn how to create AI agents with cutting-edge multimodal RAG capabilities using Friendli Serverless Endpoints and Milvus DB.
Register now: https://lnkd.in/g_JqNyV4