Liqid

Information Technology and Services

Westminster, Colorado · 8,478 followers

Accelerate outcomes with new levels of IT efficiency, flexibility and agility.

About us

At Liqid, we are at the forefront of high-performance computing solutions, fundamentally changing how organizations tackle today's immense computational challenges. Our groundbreaking technology enables the world's highest-density GPU servers, designed to reduce costs through efficient resource utilization, pooling, and dynamic allocation of GPU resources. We go beyond just facilitating complex computational tasks; we are transforming the landscape of compute delivery, efficiency, and adaptability. With Liqid, businesses are empowered to redefine the limits of their computing capabilities, embracing a future of unparalleled performance and flexibility.

Website
https://liqid.com/
Industry
Information Technology and Services
Company size
51-200 employees
Headquarters
Westminster, Colorado
Type
Privately held
Founded
2013

Locations

  • Primary

    11400 Westmoor Circle

    Ste 225

    Westminster, Colorado 80021, US

Liqid employees

Posts

  • Congratulations to Liqid customer Pienso on the launch of Pienso Faun. Powered by Liqid software-defined composable infrastructure, and in partnership with NVIDIA and Dell Technologies, Pienso Faun uses composable bare-metal infrastructure to support their vision: a garden of small language models waiting to be sculpted by the non-technical enterprise user, carefully optimized and deployed for fast inference and robust quality. Pienso Faun delivers on-premises AI infrastructure and a smarter way for organizations to build and deploy their own language model garden. Now, enterprise teams can dynamically compose GPUs, CPUs, and storage on demand into a solution that is integrated with a garden of smaller language models. Pienso Faun lets enterprises create, experiment, and deploy AI models based on actual needs and workloads. The result is optimized language model performance, balanced with powerful, agile, and efficient infrastructure that eliminates the risk of costly overprovisioning and ensures seamless hardware upgrades. Congratulations again. We're excited about composing the future together. (link to Pienso's announcement blog in comments) #aiinfrastructure #liqid #composableinfrastructure

  • Honored to be part of your strategic tech team, Artist Anywhere! Congratulations on the launch. #liqid #composableinfrastructure

    From Artist Anywhere (401 followers):

    Congratulations to the VFX Production team of Leah Garner Orsini, Douglas Purver, and Tessa Rittersbach, on the successful launch of Zero Day, now available for streaming on Netflix! Thanks for your support! The Zero Day VFX team utilized our Artist Anywhere Production Pipeline, incorporating Data IO automation tools, cloud workstations, high-performance production storage, and our Cyclone delivery system. Additionally, we supported the in-house team of Artists running Nuke on our bare metal cloud systems. Special team recognition goes to Den Serras, Aparna Ghagre, and The Redesign Group Cloud Engineering team for their valuable contributions to this project. The Artist Anywhere cloud platform owes its success to our strategic technology partners: Autodesk, VMware, Dell Technologies, pixitmedia, Liqid, NVIDIA, Amazon Web Services (AWS), Wasabi Technologies, Equinix, and Okta. #VFX #CloudStudio #DataIO #Automation

  • Let's SKO! This week, the Liqid sales, marketing, product, engineering, and leadership teams gathered to learn, share, align, and set the stage for our best year ever. A few key takeaways: (1) a renewed focus on focus...focus on key verticals where we have repeatedly demonstrated the value of software-defined composable infrastructure (higher education, financial services, government, media & entertainment, and service providers offering GPU-as-a-Service); (2) strategies to accelerate growth in the AI inference infrastructure market, GPU pooling and sharing, CXL memory, HPC, and VDI; and (3) the power of being present, teamwork, execution, and resilience. A massive thanks to our sales leaders, product experts, and special guest speakers for bringing their important perspectives and outside-in insights to the event. And, of course, to our sales team—the real drivers of our success—get ready to make 2025 our best year yet! #SalesKickoff #SKO2025 #Liqid #composableinfrastructure

  • Everyone knows the impact of DeepSeek was felt far and wide across financial markets. However, the DeepSeek effect, and the impact that rapidly advancing new models will have on IT infrastructure, are only beginning. What we once thought was a performance-only race has evolved to demand not only compute and memory but ultimately more flexibility. So that means: (1) GPUs alone won't cut it – While GPUs have been the go-to for AI workloads, scaling efficiently now requires a more composable approach to infrastructure. Dynamically pooling, sharing, and reallocating GPUs, CPUs, and memory is becoming essential. (2) The memory wall and bottlenecks are real – LLMs are memory-hungry. Traditional infrastructure often struggles to provide the bandwidth needed to avoid slowdowns. High-speed interconnects and real-time composability will be key. (3) Sustainability continues to be a concern – As energy consumption soars, enterprises must rethink power efficiency. Smarter workload orchestration and infrastructure optimization can help curb escalating operational costs. Composable infrastructure uniquely allows IT teams to dynamically allocate resources based on workload needs—without overprovisioning. This means more efficiency, enhanced AI engagement, and reduced costs, all while delivering maximum performance when needed. The question isn't if infrastructure needs to evolve—it's how fast organizations can adapt. How is your team preparing for the next wave of models in the post-DeepSeek era and AI-driven workloads overall? If you want some guidance, check out the new Liqid blog linked in the first comment. www.liqid.com #composableinfrastructure #deepseek #AI #cdi

  • Liqid is now listed on NVIDIA.com as a key Holoscan infrastructure partner. NVIDIA's Holoscan is redefining real-time AI for video, imaging, and edge computing. To power these demanding workloads, NVIDIA has listed Liqid as a key infrastructure partner, enabling dynamic composability for next-gen AI applications. Why Liqid for Holoscan? And what does this mean for the future of AI-driven video and edge computing? Drop a comment or DM us if you'd like to take a deeper dive! #AI #NVIDIA #Holoscan #Liqid #EdgeComputing NVIDIA Holoscan for Media application: https://lnkd.in/gh9qsTrB

  • Day 1 of AI Everything is in the books. If you thought you knew AI, TRUST US, we have barely scratched the surface. Every speaker emphasized that we are at the starting line of what's possible. Discussions ranged from DeepSeek (obviously) to small language models, open source, power consumption concerns, movie magic, monsters, aliens, robots, zombies, agents, ethics, data sovereignty, the good, the slightly scary, and so much more. And more robots, thanks to Kate Darling and Mr. Spaghetti. Of course, we were encouraged to hear a significant focus on AI inferencing. Attendee consensus is that we're shifting from building models to engaging with them, both at a personal level and in business. Liqid is ready for this transition, and we had a number of meaningful conversations about how our software-defined composable infrastructure for GPUs and memory is poised to help make AI inferencing blazing fast, while alleviating some of the other challenges, such as power consumption, that come with traditional infrastructure deployments. Leading one of the conversations about AI infrastructure was our CEO Edgar Masri, who posed the question about the future of AI infrastructure to His Excellency Omar Sultan AlOlama, the UAE's Minister of State for Artificial Intelligence, Digital Economy, and Remote Work Applications. Liqid's Chairman Ashish (Ash) Puri highlighted more on the DeepSeek effect and inferencing against more efficient models in his conversation with Anna Gressel. We're looking forward to days two and three. More to come…literally! #liqid #AIinferencing #AI #AIeverything

  • Liqid is looking forward to the AI Everything Global event in Abu Dhabi and Dubai. The AI Everything Global Summit is happening at The St. Regis Saadiyat Island Resort, Abu Dhabi, on 4 February, followed by the expo on 5–6 February at Expo City Dubai. Our CEO, Edgar Masri, will be joining Ashish (Ash) Puri from Lightrock and Rhett Power from Forbes on the 10X stage at the expo on Wednesday, 5 Feb at 11:00am. Ash will also be offering his perspective on the DeepSeek AI effect on the Summit stage on Tuesday, 4 Feb at 2:10pm. As an invited AI50 startup, Liqid will be showcasing our software-defined composable infrastructure solutions that deliver flexible, high-performance, and efficient datacenter and edge solutions for AI inferencing in the enterprise, financial services, higher education and research, telco providers, government, and media & entertainment. Please reach out for meetings at the event and visit Liqid at the expo in Hall 3: Stand P57 (H3-P57). #composableinfrastructure #AIEverythingGlobal #liqid

  • Everyone knows that a picture is worth a thousand words. But pictures showing successful installs of composable NVIDIA GPUs in a Liqid chassis up and running inside a happy customer's datacenter are absolutely priceless. Thank you to James Holdsworth for your continued partnership and vision for what software-defined composable infrastructure can achieve in bringing compute closer to the data for higher performance and efficiency. And as you said recently, “we’re just getting started!” www.liqid.com #composableinfrastructure #liqid #gpu

    • Liqid software-defined composable infrastructure installed in a customer datacenter.
  • Liqid reposted this

    Ashish (Ash) Puri

    Partner | Growth Equity | Impact Tech | Climate

    Fantastic to see Liqid, a Lightrock portfolio company, listed on the NVIDIA website as a key infrastructure partner. The partnership supports NVIDIA's application called Holoscan for Media, which allows broadcast, streaming, and live sports companies to run live video pipelines on the same type of infrastructure as AI applications. Holoscan (https://lnkd.in/e2PY3pfv) utilizes substantial GPU resources, which is where Liqid Composable Infrastructure steps in, dynamically swapping GPUs without impacting the servers' capacity. Check out the link in the comments below. Scroll down to Partner Ecosystem, Building the Future of Live Media, and click on the Infrastructure Partners tab. #GPUs #ComposableInfrastructure #DataCenters #NVIDIA #Liqid #Holoscan #FutureOfCompute

  • Software-defined composable GPUs AND memory…it's like peanut butter on jelly on AI inferencing gold! Matching DRAM with GPUs for successful AI inferencing is table stakes. Software-defined composable infrastructure is the not-so-secret sauce for the best match. Faster data access means less waiting and fewer slowdowns: GPUs are built to process tons of information at once. If there isn't enough memory, or enough memory bandwidth, the GPU has to wait, which slows everything down and affects how quickly AI can make decisions. Handling big data: Like we said, AI models are all about large amounts of data. If the memory can't load and store an LLM fast enough or process RAG, delays happen and the system (and the business) slows, which is especially problematic when time is critical, like in real-time AI applications. Fueling growth: As AI systems get more advanced, they need more memory to keep up. Matching the memory properly ensures that as things get bigger, the system can handle it without slowing down. Saving power: Efficient memory usage also helps save energy, which is especially important in large AI setups where power costs can add up. Liqid's vision for software-defined composable infrastructure matches CXL-composed memory with GPUs, while maintaining performance and adding unique flexibility, to ensure that AI systems work as effectively as possible - making real-time decisions and handling large amounts of data without delays, which is essential for maximizing AI success. www.liqid.com #composableinfrastructure #liqid #cxl #aiinferencing #peanutbutteronjellyongold


Similar pages

View jobs

Funding

Liqid: 7 total rounds

Last round

Unknown

US$22,257,821.00

See more info on Crunchbase