Data Center Frontier Trends Summit 2024 Recap! What an incredible year it’s been for #DCFtrends and the data center industry as a whole! From groundbreaking innovations and valuable connections to content-rich educational sessions, the inaugural 2024 #SoldOut summit was a standout. A huge thank you to everyone who made it possible – here’s to an even bigger and better 2025! Be the first to know when 2025 registration opens: https://bit.ly/4ehabd4 #DataCenterFrontier #DataCenterAlley #DataCenter #Energy #Cloud #DigitalInfrastructure #AI
Data Center Frontier
Book and Periodical Publishing
Lawrenceville, NJ · 21,267 followers
About us
Data Center Frontier charts the future of data centers and cloud computing. We write about what’s next for the Internet, and the innovations that will take us there. DCF is a publication of Endeavor Business Media. The data center is our prism. We tell the story of the digital economy through the facilities that power the cloud and the people who build them. We track the impact of new technologies, including the Internet of Things (IoT), machine learning and artificial intelligence, virtual reality and edge computing.
- Website: https://www.datacenterfrontier.com (external link for Data Center Frontier)
- Industry: Book and Periodical Publishing
- Company size: 2-10 employees
- Headquarters: Lawrenceville, NJ
- Type: Privately Held
- Founded: 2015
- Specialties: Data centers, Cloud computing, Journalism, Online Publishing, and Social media
Locations
- Primary: 237 Glenn Avenue, Lawrenceville, NJ 08648, US
Employees at Data Center Frontier
- Doug Mohney: Writing about all things technology, including broadband, IoT, data centers, and space/satellite. ABQ and the Cape potentially on the travel schedule…
- David Chernicoff
- Carrie Kirkbride: Sales and Marketing professional in B2B industries. MBA with an emphasis in Marketing.
- Matt Vincent: Editor in Chief at Endeavor Business Media
Updates
-
Data Center Frontier Trends Summit 2024: Photographs & Memories (#Slideshow) With top-flight speakers and unique, professionally led interactive networking, the DCF Trends Summit (Sept. 4-6) was a vibrant experience for all involved. #datacenter #cloud #AI https://lnkd.in/gn7nKq2b
-
DCF Trends Summit: "Top 5 Data Center Trends to Watch for 2025" This in-depth article fully recounts discussions from the closing panel, as titled above, of the inaugural Data Center Frontier Trends Summit, held Sept. 4-6, 2024 in Reston, Virginia. Rich Miller? Chris Downie Daniel Crosby? Bill Kleyman ???? Erica T. Flexential Legend Energy Advisors Apolo? Liquid Cooling Coalition? Matt Vincent? Endeavor Business Media #dcftrends #datacenter #hyperscale #colocation #edge #cloud #AI #IT #digitalinfrastructure Read on: https://lnkd.in/eSy9EaZh
DCF Trends Summit: Top 5 Data Center Trends to Watch for 2025
datacenterfrontier.com
-
Hopping on a plane for the holidays? Do what our editors do and download a few #datacenter industry #whitepapers as offline reading for the ride. Browse our listing: https://lnkd.in/gnnAijDd #hyperscale #colocation #edge #AI #IT #liquidcooling #power #cloud #digitalinfrastructure
White Papers
datacenterfrontier.com
-
What's Up With Supermicro? Catching up on all the intrigue surrounding the enigmatic maker of #AI and #HPC #datacenter #infrastructure, which is a key NVIDIA partner. Read on: https://lnkd.in/e_P34z4n
What's Up With Supermicro?
datacenterfrontier.com
-
Absorbing #AI #datacenter #infrastructure advice and #GPU #engineering discussion from Rosendin
Turn Key Electrical Solutions Provider in the Energy, Data, Transportation and Commercial Building Industries
AI Electrical Overloading Is Real

While anecdotal evidence of GPU-based processing imposing current overloads on upstream electrical systems has proven true, this does not pose a significant obstacle to new facility design or to the implementation of GPU-based systems in new or existing facilities. While not an obstacle, it is still a core consideration these days.

Rosendin is currently engaged in the design-build delivery of three bespoke AI-centric facilities. None of them are remotely alike, unlike commoditized, homogeneous cloud/CPU-based facilities. Cloud is an application-centric world, while AI facilities are hardware-centric, not unlike mainframe compute from a couple of decades back. It's now about capacity and how the hardware runs. MWs are MWs.

When looking at your AI loads, the persistent 180% FLA overload may spoof the electrical system into thinking there's more load than actually exists. That 180% figure assumes a harmonized processing bed; with processors all working on a disaggregated basis, the actual current will be less. Overall compute workload below full utilization would also alleviate some of this.

Since AI hardware doesn't conform to current cloud-based facility design, we all have to adapt. One of those adaptations is addressing NVIDIA chip current fluctuations. Since we have to solve the issue, we ran the math. The 1.45 PUE model, now typical for the SE USA, is at the bottom of the post.

Certainly, overall IT workload will help the overload condition. There is also a divergence between learning and inference AI. Learning systems grind away almost continuously at a high utilization rate, with much lower storage requirements than other IT applications. When you place AI into an inference application, we expect utilization to dip.

Current workarounds include:
- Oversizing the upstream UPS system if its instantaneous rating is less than 200%.
- Limiting UPS system loading to 80%.
- Placing some overload capability at the building or site substation level.

Ask questions of your end user, be careful about overdesign, watch how your data hall systems work, and set low/high system sizing boundaries. Thanks for stopping by.
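For illustration, here is a minimal Python sketch, with hypothetical numbers that are not from Rosendin's model, of how the 180% FLA transient, a diversity factor for disaggregated processors, and the 80% UPS loading limit interact when sizing upstream capacity:

# Hypothetical sizing sketch for the AI overload discussion above.
# All factors are illustrative assumptions, not design values.

def required_ups_kva(it_load_kw, fla_overload=1.80, diversity=0.85,
                     ups_max_loading=0.80, power_factor=1.0):
    """Estimate UPS capacity for a GPU hall.

    it_load_kw      nominal (nameplate) IT load in kW
    fla_overload    transient draw as a multiple of full-load amps
                    (the post cites a persistent 180% FLA overload)
    diversity       fraction of processors peaking simultaneously;
                    disaggregated work pushes this below 1.0
    ups_max_loading steady-state loading cap (the post suggests 80%)
    """
    steady_kva = it_load_kw / (ups_max_loading * power_factor)
    transient_kva = it_load_kw * fla_overload * diversity / power_factor
    # Size to whichever constraint is larger; a UPS with an
    # instantaneous rating of 200%+ may cover the transient itself.
    return max(steady_kva, transient_kva)

print(f"Suggested UPS sizing: {required_ups_kva(2000):,.0f} kVA")

With these assumed numbers, a 2,000 kW hall sizes to the transient case (about 3,060 kVA) rather than the 80% steady-state rule (2,500 kVA), which mirrors the post's point that diversity and utilization, not nameplate load, drive the answer.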
-
Blackstone Invests $500 Million in Lancium AI Campuses: Blackstone has taken an equity stake in Lancium, providing $500 million to support development of data center MegaCampuses, Bloomberg reports.

At this year's Data Center World, Lancium's Ali Fenn laid out the company's vision for gigawatt-scale campuses to support AI workloads with renewable energy. Fenn said Lancium hopes to enable new energy technologies and collaborate with utilities and grid operators. This includes providing “responsive” loads that can balance grid demand, making it easier to add solar and wind power, which are sustainable but intermittent.

With Blackstone's backing, Lancium is positioned to build out its five planned campus sites. The first Lancium Clean Campus, in Abilene, will house a 206 MW data center backed by a joint venture of Crusoe, Blue Owl and Primary Digital Infrastructure.

Bloomberg piece at DCK: https://lnkd.in/efCtGp8E
My DCF piece discussing Lancium's vision: https://lnkd.in/eg_P7Gqp
-
Selecting the Right Coolant Distribution Unit for Your AI Data Center

In this engaging guest article for DCF's 'Voices of the Industry' forum, Ian Reynolds, P.Eng., Senior Project Engineer with CoolIT Systems, outlines considerations for selecting the best CDU for a given AI data center's liquid cooling needs. For facility considerations, Reynolds advises: "Choosing a CDU starts with a thorough understanding of your facility's design and constraints. Consider the following:

- Facility Water Availability: The availability of facility water will drive the decision between the two main categories of row-based CDUs: liquid-to-liquid or liquid-to-air. Liquid-to-air CDUs are preferable if limited facility water is available, for ease of deployment. If facility water is already plumbed into the area where the CDUs will be installed (or can feasibly be brought in), liquid-to-liquid CDUs are preferable due to their increased capacity.

- Physical Deployment: The size and weight of the CDU must align with your building’s constraints, such as elevator capacity, floor loading limits, and available space. Proper assessment ensures smooth installation and operation, avoiding costly modifications or downtime. CDUs are designed to be installed among the server racks, within the data center hot/cold aisle, or outside the data center halls in the facility or mechanical room.

- Secondary Fluid Network (SFN) Buildout: Efficiently planning the connection between the row-based CDUs and the racks is crucial due to its impact on overall system pressure drop, serviceability, and scalability."

The piece goes on with more advice on choosing the best CDU for your AI facility, providing similar breakouts for CDU performance; redundancy, monitoring and serviceability; and manufacturing and sourcing. #datacenter #liquidcooling #infrastructure #hyperscale #colocation #CDU #AI #datacentercooling Read on: https://lnkd.in/gWPi58Sz
Selecting the Right Coolant Distribution Unit for Your AI Data Center
datacenterfrontier.com
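Reynolds' facility-water point comes down to heat-transfer capacity. As a back-of-envelope companion to the article (this is not CoolIT's method, and the values are illustrative assumptions), the standard relationship Q = m·cp·ΔT gives the secondary coolant flow a CDU must deliver for a given rack load:

# Back-of-envelope coolant flow estimate for CDU selection.
# Illustrative assumptions only; real selection uses vendor data.

CP_WATER = 4186.0    # J/(kg*K), specific heat of water
RHO_WATER = 997.0    # kg/m^3, density of water near 25 C

def required_flow_lpm(rack_load_kw, delta_t_c=10.0):
    """Liters per minute of coolant needed to absorb rack_load_kw
    at a supply-to-return temperature rise of delta_t_c."""
    m_dot_kg_s = (rack_load_kw * 1000.0) / (CP_WATER * delta_t_c)
    return m_dot_kg_s / RHO_WATER * 1000.0 * 60.0

# Example: a 100 kW AI rack at a 10 C rise needs roughly 144 L/min.
print(f"{required_flow_lpm(100):.0f} L/min")

Summing that per-rack figure across a row and comparing it against available facility water is one quick way to see whether a liquid-to-liquid or liquid-to-air CDU is the realistic choice.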
-
Data Center Dynamics Debates Coalesce 2024 Industry Nuances, Views At this month's DatacenterDynamics (DCD) Connect event in Leesburg, Virginia, a series of debates tackled some of the most pressing issues within the data center industry. #datacenter #hyperscale #colocation #edge #engineering #construction #sustainability #debate Featuring: Black Box, Exyte, CloudHQ, LLC, Verizon, Digital Realty, Nautilus Data Technologies, Duos Edge AI, JLL, FLEXNODE, Hudson IX, Arup, Akamai Technologies, Schneider Electric, eStruxture Data Centers, ENERCON, Google, RPower, Elea Data Centers, IFC - International Finance Corporation, Digital Gravity, Apolo, Aligned Data Centers, ECL, HDR, Waterfall Security Solutions, Vantage Data Centers, Trane, Penn State University, OpenAI. Read on: https://lnkd.in/dfdfimJk
Data Center Dynamics Debates Coalesce 2024 Industry Nuances, Views
datacenterfrontier.com
-
Elegant aerial drone camera stylings from the picturesque geographic #datacenter frontier in #Phoenix. Well played, Aligned Data Centers.
Major progress at our PHX-07 campus! Phoenix is one of the most dynamic and rapidly growing data center markets in the country: proximity to Los Angeles, relatively low power costs, and a 20-year sales tax exemption from the Arizona Commerce Authority that can save millions for companies that colocate here. Thank you to everyone involved for bringing this project to life.