Exploring Swift’s Operation Queues: Enhancing Threading with Limitations

Apple has introduced abstractions over threads to address the issues discussed previously. These include operation queues, Grand Central Dispatch, and Combine. Let’s delve into operation queues, understanding how they improve threading while also recognizing their limitations.

Operation Queues

Operation queues, introduced in macOS Leopard and the iOS 2.0 SDK, offer a different approach compared to threads. They separate the concept of how work is performed from the actual work itself, unlike threads where both were combined.

To start with operation queues, you create one and add operations to it for execution:

let queue = OperationQueue()
queue.addOperation {
  print(Thread.current)
}

Operations can be added to the queue for simultaneous execution:

queue.addOperation { print("1", Thread.current) }
queue.addOperation { print("2", Thread.current) }
// … (adding more operations)        

However, similar to threads, the execution order in the queue is not guaranteed:

// Output might be in different order each time
1 <NSThread: 0x100904c20>{number = 2, name = (null)}
2 <NSThread: 0x100904e30>{number = 4, name = (null)}
// … (other thread numbers)        

Operations in queues share similar features with threads. For instance, you can set the priority of an operation using qualityOfService:

let operation = BlockOperation {
  print(Thread.current)
}
operation.qualityOfService = .background
queue.addOperation(operation)

Cancellation of operations, similar to threads, is cooperative. You need to regularly check if an operation has been cancelled to halt its execution:

let operation = BlockOperation()
operation.addExecutionBlock { [unowned operation] in
  Thread.sleep(forTimeInterval: 1)
  guard !operation.isCancelled else {
    print("Cancelled!")
    return
  }
  print(Thread.current)
}

However, cancellation may not immediately interrupt ongoing tasks within an operation. For instance, a sleeping thread within an operation continues despite cancellation:

operation.addExecutionBlock { [unowned operation] in
  let start = Date()
  defer { print("Finished in", Date().timeIntervalSince(start)) }
  Thread.sleep(forTimeInterval: 1)
  guard !operation.isCancelled else {
    print("Cancelled!")
    return
  }
  print(Thread.current)
}
queue.addOperation(operation)
operation.cancel()
// Output:
// Cancelled!
// Finished in 1.0988129377365112

Unlike threads, operations have no built-in storage analogous to the thread dictionary, so there is no way to carry data along implicitly with an operation. Any context a deeper layer of the application needs must instead be passed explicitly, as parameters or captured values.
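A minimal sketch of what that explicit passing looks like. Both `RequestContext` and `fetchUser` are hypothetical names invented for illustration, not part of any Apple API:

```swift
import Foundation

// Hypothetical request-scoped value. With raw threads this might have
// lived in Thread.current.threadDictionary and been read implicitly.
struct RequestContext {
  let requestID: String
}

// Hypothetical helper: the context must arrive as an explicit parameter,
// because the operation may run on any thread the queue chooses.
func fetchUser(context: RequestContext) -> String {
  return "user-for-" + context.requestID
}

let queue = OperationQueue()
let context = RequestContext(requestID: "abc-123")
queue.addOperation {
  // The closure captures `context` explicitly; nothing is stashed
  // "on the thread" for deeper layers to discover.
  print(fetchUser(context: context))
}
queue.waitUntilAllOperationsAreFinished()
```

Every layer between the queue and `fetchUser` has to thread the context through by hand, which is exactly the ergonomic cost the thread dictionary avoided.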

Operation queues address several issues associated with threads. They introduce a more nuanced way of coordinating operations, supporting dependencies to regulate the execution order. Let’s delve into how these queues handle dependencies and mitigate thread explosion issues.

Handling Dependencies

Operation queues allow for sequential execution of operations based on dependencies. For instance, consider operations A and B, where B depends on A:

let queue = OperationQueue()
let operationA = BlockOperation {
  print("A")
  Thread.sleep(forTimeInterval: 1)
}
let operationB = BlockOperation {
  print("B")
}
operationB.addDependency(operationA)
queue.addOperation(operationA)
queue.addOperation(operationB)

Running this code will print “A” immediately, followed by “B” after a one-second delay. You can express complex dependency graphs by linking operations together. For example,

let operationC = BlockOperation { print("C") }
let operationD = BlockOperation { print("D") }
operationB.addDependency(operationA)
operationC.addDependency(operationA)
operationD.addDependency(operationB)
operationD.addDependency(operationC)
queue.addOperation(operationA)
queue.addOperation(operationB)
queue.addOperation(operationC)
queue.addOperation(operationD)

The output shows “A” first and “D” last, while “B” and “C” may print in either order: with no dependency between them, they are free to run in parallel.

Mitigating Thread Explosion

Operation queues also guard against thread explosion by managing a pool of threads rather than creating one per unit of work:

let workCount = 1_000
let queue = OperationQueue()
for n in 0..<workCount {
  queue.addOperation { print(n, Thread.current) }
}

This code adds a large number of operations to the queue, yet the number of distinct threads appearing in the output is far smaller. The queue behaves like a thread pool, reusing a managed set of threads and avoiding the overhead of creating one per operation.
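The queue also exposes this pooling behavior directly through `maxConcurrentOperationCount`, which caps how many operations run at once. A small sketch (a count of 1 makes the queue fully serial):

```swift
import Foundation

let queue = OperationQueue()
// Cap the number of operations running at once. A count of 1 makes
// the queue serial, so operations execute in the order they were added.
queue.maxConcurrentOperationCount = 1

var results: [Int] = []
for n in 0..<5 {
  queue.addOperation {
    results.append(n) // safe without a lock only because the queue is serial
  }
}
queue.waitUntilAllOperationsAreFinished()
print(results) // [0, 1, 2, 3, 4]
```

Setting a moderate cap (rather than 1) is a common way to bound resource usage while still getting parallelism.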

Limitations

Operation queues offer some powerful features but also come with their set of limitations. Let’s explore some problems with operation queues and how they affect asynchronous work.

Blocking Operations

Operation queues lack native support for non-blocking asynchronous work. Consider waiting for a delay:

for n in 0..<workCount {
  queue.addOperation {
    Thread.sleep(forTimeInterval: 1)
    print(n, Thread.current)
  }
}

This code relies on Thread.sleep, which parks a real thread for the entire delay. The thread is occupied doing nothing but waiting, wasting a scarce resource.
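By contrast, Grand Central Dispatch (mentioned at the top) can schedule delayed work without parking a thread for the duration. A sketch of the same one-second delay using `DispatchQueue.asyncAfter`:

```swift
import Foundation

// asyncAfter registers the closure with the system to run after the
// deadline; no thread sits blocked for the full second, unlike
// Thread.sleep inside an operation.
var fired = false
let semaphore = DispatchSemaphore(value: 0)
DispatchQueue.global().asyncAfter(deadline: .now() + 1) {
  fired = true
  print("fired on", Thread.current)
  semaphore.signal()
}
// Wait here only so this sample does not exit before the closure runs.
semaphore.wait()
```

OperationQueue has no built-in equivalent; expressing non-blocking delays requires dropping down to GCD or timers.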

Lack of Cooperation and Resource Sharing

Operation queues provide no mechanism for operations to cooperate. A running operation has no way to yield its thread to other work, so operations simply compete for CPU time. Intensive operations can starve everything else: operations spinning in infinite loops can monopolize the pool’s threads, and a prime computation queued behind them may never get a chance to start.

for n in 0..<workCount {
  queue.addOperation {
    print(n, Thread.current)
    while true {} // spin forever, never yielding the thread
  }
}
queue.addOperation {
  print("Starting prime operation")
  nthPrime(50_000) // a long-running helper defined elsewhere
}

Limited Cancellation and Odd API

Cancelling an operation does not automatically cancel the operations that depend on it. The dependency API is also awkward: every operation must be declared up front, its dependencies wired, and each operation then added to the queue in separate steps.
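A sketch demonstrating that cancellation does not cascade through dependencies: cancelling A skips A’s body, but a cancelled operation still counts as finished, which satisfies B’s dependency.

```swift
import Foundation

let queue = OperationQueue()
var log: [String] = []
let operationA = BlockOperation { log.append("A") }
let operationB = BlockOperation { log.append("B") }
operationB.addDependency(operationA)

// Cancel A before the queue runs it. A's body is skipped, but the
// cancelled operation still "finishes", unblocking B.
operationA.cancel()
queue.addOperation(operationA)
queue.addOperation(operationB)
queue.waitUntilAllOperationsAreFinished()
print(log) // ["B"]: B ran even though the operation it depends on was cancelled
```

If B only makes sense when A actually produced a result, you must propagate the cancellation by hand, for example by checking `operationA.isCancelled` inside B.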

Superficial API and Data Races

The API can also feel dated: it relies heavily on object-oriented patterns, and mutating shared state from within operations can lead to exactly the same data races encountered with threads.
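A sketch of such a race: many operations mutating one counter. Nothing in OperationQueue synchronizes the access; here an `NSLock` restores correctness, just as it would with raw threads.

```swift
import Foundation

let queue = OperationQueue()
let lock = NSLock()
var count = 0

for _ in 0..<1_000 {
  queue.addOperation {
    // Without the lock, this read-modify-write of `count` from many
    // threads is a data race and the final value is unpredictable.
    lock.lock()
    count += 1
    lock.unlock()
  }
}
queue.waitUntilAllOperationsAreFinished()
print(count) // 1000 with the lock in place
```

Remove the lock and the final count will typically come up short, because increments from different threads overwrite each other.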

Summary

In essence, operation queues offer real tools for sequencing and parallelizing work, but they lack support for non-blocking operations, cooperation over shared CPU resources, and ergonomic dependency management. The result is blocked threads, starvation under load, and a clunky object-oriented API. They also do nothing to address multithreaded race conditions, leaving data races just as possible as with raw threads.
