Prefect
Software Development
Washington, District of Columbia · 26,431 followers
Trust your workflows, from script to scale.
About us
Modern workflow orchestration for data and ML engineers
- Website
- https://www.prefect.io (external link for Prefect)
- Industry
- Software Development
- Company size
- 51-200 employees
- Headquarters
- Washington, District of Columbia
- Type
- Privately held
Products
Prefect
Data Science & Machine Learning Platforms
Prefect is open-source workflow orchestration for building, observing, and reacting to your data pipelines. With Prefect, control and observe:
1. What code runs, and the data it handles
2. When it runs, scheduled or triggered by an event
3. Where it runs, and the systems it talks to
Locations
-
Primary
1200 18th St NW
Washington, District of Columbia 20036, US
Employees at Prefect
Updates
-
Need to run 20,000 tasks in parallel? Prefect can handle it. Traditional orchestrators struggle at this scale; Prefect was built for it. Our CTO, Chris White, just published a deep dive into how task mapping in Prefect helps data teams move past loops, scale to tens of thousands of parallel tasks, and maintain full observability the whole way through. Whether you’re making API calls, processing customer data, or orchestrating machine learning pipelines, Prefect’s architecture makes large-scale parallelism not just possible but practical. Here’s what makes it work so well in practice:
- Dynamically generate and run thousands of tasks
- Use structured retries, caching, and observability at the task level
- Swap task runners like Dask or Ray without rewriting your flows
- Keep workflows adaptive with runtime task discovery
- Leave static DAGs and scheduler bottlenecks behind
This is more than just a syntax trick. It’s the result of intentional architecture designed to make massive orchestration feel simple.
Read the full post, Beyond Loops: How Prefect's Task Mapping Scales to Thousands of Parallel Tasks: https://lnkd.in/geEJcTRx
#DataEngineering #WorkflowOrchestration #Python #Prefect #ParallelProcessing #Scalability #TaskMapping #Dask #Ray #OpenSource
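As a flavor of what that looks like in code, here is a minimal sketch of task mapping. The `square` task and its inputs are made up for illustration, but `.map` is Prefect's actual fan-out API:

```python
from prefect import flow, task

@task(retries=2)
def square(n: int) -> int:
    # Each mapped run is a first-class task with its own retries,
    # caching, and task-level observability.
    return n * n

@flow
def squares(numbers: list[int]) -> list[int]:
    # .map fans out one task run per input element; the run count is
    # decided at runtime instead of being frozen into a static DAG.
    futures = square.map(numbers)
    return [f.result() for f in futures]

if __name__ == "__main__":
    squares(list(range(1_000)))
```

Swapping in a Dask or Ray task runner changes where these mapped runs execute without touching the flow code itself.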
-
How do you run hundreds of concurrent Python tasks without chaos? You don’t. You use AnyIO. At Prefect, we orchestrate over half a billion task runs every month. That’s millennia of tasks, every single month. Async execution isn’t just part of the system. It is the system. And behind it is a library we trust deeply: AnyIO. Our VP of Product, Adam Azzam, just kicked off a new blog series: Love Letters to the Tools That Power Prefect. The first one is for AnyIO. Here’s what makes it such a critical part of our engine:
- Structured concurrency that keeps cleanup and cancellation predictable
- One API that works with both asyncio and Trio
- Clean interfaces for task groups, cancellation scopes, subprocesses, and more
- Trusted by OpenAI, Starlette, Anthropic, and Prefect
Async in Python can get messy fast. AnyIO gives us a way to manage concurrency with clarity and control.
Read the first love letter, How AnyIO Powers Prefect’s Async Architecture: https://lnkd.in/gxr6nJqK
More to come. More libraries to thank.
#Python #AsyncIO #AnyIO #Concurrency #WorkflowOrchestration #DataEngineering #OpenSource #Prefect
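Here is a minimal sketch of the structured-concurrency pattern the post describes, using AnyIO's task-group API; the `fetch` coroutine is a stand-in for real work:

```python
import anyio

async def fetch(name: str) -> None:
    # Stand-in for I/O-bound work such as an API call or database query.
    await anyio.sleep(1)
    print(f"{name} done")

async def main() -> None:
    # A task group is a structured-concurrency scope: every task started
    # inside it is guaranteed to finish, or be cancelled, before the
    # block exits, so cleanup and cancellation stay predictable.
    async with anyio.create_task_group() as tg:
        for i in range(100):
            tg.start_soon(fetch, f"task-{i}")

if __name__ == "__main__":
    anyio.run(main)  # runs on asyncio by default; pass backend="trio" for Trio
```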
-
Prefect reposted
10 years ago I built XComs for Airflow, and they're still the primary mechanism for moving data between tasks today. That's not a good thing. As a data scientist, I was shocked to discover that Airflow had no way to pass information from one task to the next. I came up with XComs: a simple idea to serialize small pieces of metadata through the Airflow metadata database. XComs fundamentally break the Python programming model in ways that aren't immediately obvious. What you think is happening (data flowing directly between tasks) and what's actually happening (metadata being stored and retrieved through a database or storage layer) create a cognitive disconnect that ripples through your entire workflow. When data can't flow naturally, engineering teams are forced to build increasingly elaborate workarounds: using temporary cloud storage, creating intermediate tables, or implementing parallel message queues, all of which represent engineering overhead disconnected from business objectives. In 2015, XComs brought the illusion of data movement to Airflow, a necessary stepping stone at the time. In 2025, what was innovation has become limitation. The modern data ecosystem demands true Python-native workflows, not serialization checkpoints masquerading as data flow. Data teams need workflows that transcend artificial boundaries (see the sketch below):
- Share data natively between tasks using regular Python objects
- Scale instantly based on demand
- Run with true Python parallelism
- Update independently without affecting other workflows
- Test locally and deploy with confidence
Prefect exists to solve these fundamental problems.
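For contrast, here is a sketch of what native data flow looks like in Prefect. The tasks are hypothetical, but the pattern, plain Python return values moving between tasks with no serialization checkpoint, is exactly what the post is arguing for:

```python
from prefect import flow, task

@task
def extract() -> list[dict]:
    # Return a regular Python object: no XCom push, no metadata database.
    return [{"id": 1, "value": 42}]

@task
def transform(records: list[dict]) -> list[dict]:
    return [{**r, "value": r["value"] * 2} for r in records]

@flow
def pipeline() -> list[dict]:
    # Data moves between tasks the way it does in ordinary Python:
    # the return value of one call becomes the argument of the next.
    records = extract()
    return transform(records)

if __name__ == "__main__":
    pipeline()
```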
-
Prefect reposted
Anyone else in the process of moving? The excitement of finding a new place is great: new adventures, new spaces! But then, there’s the elephant in the room… Packing. Sorting. Hauling. The thought of boxing up everything makes me shake in my boots (and yes, I’m still in Nashville, so the boots analogy stands). Funny enough, switching from one piece of software to another feels just as overwhelming. Moving your workflows from one tool to another isn’t just time-consuming; it can be straight-up scary. But what if you had a storage unit to make the process smoother? That’s exactly what I walked through in a recent webinar: how to keep your Airflow tasks running while getting observability through Prefect. It’s like placing your workflows in a temporary home while you transition to something better. Have you ever had to migrate workflows between tools? Let me know how it went! And if you’re thinking about moving from Airflow to Prefect, check out the webinar on-demand and the GitHub repo:
Webinar: https://lnkd.in/gsjPyHnU
Repo: https://lnkd.in/g9wshTw5
-
Missed our webinar, Observing Airflow DAGs in Prefect? Catch up on demand! You'll learn how to monitor Airflow DAGs in Prefect with specific code you can copy-paste into your setup. By the end, you'll have the tools to use Prefect's alerting and error handling system to process the failures occurring within your Airflow data workflows. https://prefec.tv/4htwu0f
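The webinar has the copy-paste code for the Airflow integration itself; as a rough flavor of the Prefect primitives it builds on, here is a sketch using task retries and a flow-level on_failure hook. The failing probe is entirely hypothetical:

```python
from prefect import flow, task

def alert_on_failure(flow, flow_run, state):
    # Runs automatically when the flow run ends in a failed state;
    # a real hook might post to Slack or page the on-call engineer.
    print(f"Flow run {flow_run.name} failed: {state.message}")

@task(retries=3, retry_delay_seconds=10)
def check_dag_status(dag_id: str) -> None:
    # Hypothetical probe; the webinar shows the real Airflow-facing code.
    raise RuntimeError(f"DAG {dag_id} reported a failed run")

@flow(on_failure=[alert_on_failure])
def watch_airflow(dag_id: str = "example_dag"):
    check_dag_status(dag_id)

if __name__ == "__main__":
    watch_airflow()  # retries three times, then fires the failure hook
```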
-
We’ll be at Data on Kubernetes Day on April 1st in London! Whether you’re scaling complex workflows, optimizing infrastructure, or just curious about how Prefect makes data orchestration effortless, we’d love to chat. Drop a comment below if you'll be there too! #Kubernetes #DataOrchestration
-
One core difference between Airflow and Prefect is simple: Airflow is a framework that forces your code to follow its rules, while Prefect is a library that adapts to your Python code. With Prefect, workflows can spin up and down dynamically, even across CPUs or GPUs. Even better, these decisions happen at runtime rather than being locked into a predefined graph. Your workflows adapt to real conditions, making better use of resources while keeping code clean and efficient. Dig into more reasons why modern teams choose Prefect over Airflow: https://prefec.tv/3FiHgcB
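A minimal sketch of that runtime behavior, assuming a hypothetical workload-size threshold; in practice, CPU versus GPU placement would come from your infrastructure or work pools rather than this toy condition:

```python
import random
from prefect import flow, task

@task
def train_small(data: list[float]) -> str:
    return "trained on CPU"

@task
def train_large(data: list[float]) -> str:
    return "trained on GPU-backed infrastructure"

@flow
def adaptive_training() -> str:
    # The branch is chosen while the flow is running; nothing about this
    # decision is locked into a predefined graph ahead of time.
    data = [random.random() for _ in range(random.randint(10, 10_000))]
    if len(data) > 1_000:
        return train_large(data)
    return train_small(data)

if __name__ == "__main__":
    print(adaptive_training())
```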
-
Step back and see the complete picture, not just individual workflows, with Lenses in Prefect. Lenses are our new approach to visualizing and comprehending your workflow ecosystem beyond the data. When workflows interact through shared resources, patterns and relationships emerge that span beyond data lineage. Lenses in Prefect transform how we visualize and understand complex data platforms.
-
Grab some peanuts and Cracker Jack for the second installment of our Moneyball Marvin series, as we build on our baseball data analysis foundation by creating an automated data pipeline to process new game data. This tutorial covers the complete process of setting up event-driven data workflows and creating a dashboard for baseball statistics. https://prefec.tv/3FobB9l
Building an Event-Driven Baseball Data Pipeline with Prefect, dbt, Snowflake & Hex
https://www.youtube.com/
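The video walks through the real pipeline with dbt, Snowflake, and Hex; as a rough sketch of just the Prefect skeleton, with every name and value hypothetical:

```python
from prefect import flow, task

@task
def load_game_data(game_id: str) -> dict:
    # Hypothetical loader; the tutorial lands real game data in
    # Snowflake before dbt models it and Hex visualizes it.
    return {"game_id": game_id, "runs": 7, "hits": 11}

@task
def update_stats(game: dict) -> dict:
    return {"game_id": game["game_id"], "runs_per_inning": game["runs"] / 9}

@flow(log_prints=True)
def process_new_game(game_id: str):
    # In the tutorial this flow runs when new game data arrives,
    # i.e. event-driven rather than on a fixed schedule.
    game = load_game_data(game_id)
    stats = update_stats(game)
    print(f"Updated dashboard stats: {stats}")

if __name__ == "__main__":
    process_new_game("2024-06-01-NYY-BOS")
```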