Efficient Resumable Downloads in Flutter with Dio

As app developers, we sometimes need to handle large file downloads, which can be a challenge, especially in unreliable network conditions. One common problem is resuming downloads after they’re interrupted. Imagine downloading a large file and having to start over every time the network cuts out—that’s frustrating!

In this post, I’ll walk through how to implement resumable chunked downloads in Flutter using Dio. We’ll break the download into chunks, resume incomplete downloads, and even merge the chunks back into a full file once completed.

Why Download in Chunks?

Downloading files in chunks, such as 5MB at a time, is an efficient way to:

  • Resume interrupted downloads: If a connection drops, we can start from where we left off instead of restarting the entire download.
  • Download in parallel: We can download multiple chunks simultaneously to speed up the process.
  • Reduce bandwidth waste: Instead of starting over, you only need to download what’s missing, reducing the overall data usage.


The Step-by-Step Approach

Let’s break down the process of implementing this solution in Flutter using the Dio package:

1. Requesting Storage Permissions in Android

Before downloading large files, declare the necessary storage permissions in your AndroidManifest.xml file. Since we are saving files to external storage, include these permissions (keep in mind that scoped storage restricts what they grant on recent Android versions):

<uses-permission android:name="android.permission.WRITE_EXTERNAL_STORAGE"/>
<uses-permission android:name="android.permission.READ_EXTERNAL_STORAGE"/>
<uses-permission android:name="android.permission.ACCESS_MEDIA_LOCATION"/>
<uses-permission android:name="android.permission.MANAGE_EXTERNAL_STORAGE"
    tools:ignore="ScopedStorage" />        

To retain legacy storage access on Android 10 (the flag is ignored from Android 11 onward), add the tools namespace to your manifest tag and the legacy flag to your application tag:

xmlns:tools="http://schemas.android.com/tools"
android:requestLegacyExternalStorage="true"

This will allow you to manage files in external storage correctly.
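Manifest entries alone are not enough on Android 6+; storage access must also be requested at runtime. The checkAndRequestPermissions() helper called later in this post isn't shown in the original, but a minimal sketch using the permission_handler package (my assumption, not part of the original code) could look like this:

```dart
import 'package:permission_handler/permission_handler.dart';

/// Minimal sketch: request storage access at runtime.
/// Assumes permission_handler is listed in pubspec.yaml.
Future<void> checkAndRequestPermissions() async {
  final status = await Permission.storage.request();
  if (!status.isGranted) {
    throw Exception('Storage permission denied');
  }
}
```

Any permission plugin works here; the important part is failing fast before starting a multi-chunk download.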


2. Get the File Size from the Server

The first thing we need to do is get the total size of the file we want to download. We use a HEAD request for this, which returns the file’s metadata, including its size, without downloading the body. Knowing the total file size allows us to calculate how many chunks we need. Note that resumable chunked downloads only work if the server supports HTTP range requests; an Accept-Ranges: bytes header in the HEAD response is a good signal.

// Assumes a shared client instance: final Dio dio = Dio();
Future<int> getFileSize(String url) async {
  try {
    final response = await dio.head(url);
    return int.parse(response.headers.value('content-length') ?? '0');
  } catch (e) {
    throw Exception('Failed to get file size: $e');
  }
}

3. Define Chunk Sizes

Let’s define the size of each chunk. For example, a 5MB chunk size is common for resumable downloads.

final int chunkSize = 5 * 1024 * 1024; // 5MB in bytes.          

Based on the file size, we can calculate how many chunks we need and define the byte range for each chunk.
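As a quick sanity check, the chunk count is just a ceiling division (assuming fileSize came from getFileSize above):

```dart
int totalChunks = (fileSize / chunkSize).ceil();
// e.g. a 12 MB (12,582,912-byte) file with 5 MB chunks -> 3 chunks
```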

4. Download Each Chunk Efficiently

Now, we loop through the file, downloading each chunk with a Range request header, which asks the server for only a specific byte range of the file. If a chunk is partially downloaded, we resume from the byte where it stopped.

Here’s where we check if the chunk is already downloaded or partially downloaded, and handle it accordingly:

Future<void> downloadChunk(String url, String filename, int chunkIndex, int start, int end) async {
  String tempFilePath = await getFilePath('$filename.part$chunkIndex');
  File tempFile = File(tempFilePath);

  // Check how much of this chunk already exists on disk.
  int downloadedBytes = 0;
  if (await tempFile.exists()) {
    downloadedBytes = await tempFile.length();
  }

  // If the chunk is fully downloaded, skip it.
  if (downloadedBytes >= (end - start + 1)) {
    print('Chunk $chunkIndex is already fully downloaded, skipping.');
    return;
  }

  // Resume from the first missing byte. Note: dio.download() truncates an
  // existing target file, which would destroy the partial chunk, so we
  // stream the response body and append it instead.
  int rangeStart = start + downloadedBytes;
  try {
    final response = await dio.get<ResponseBody>(
      url,
      options: Options(
        responseType: ResponseType.stream,
        headers: {'Range': 'bytes=$rangeStart-$end'},
      ),
    );

    final sink = tempFile.openWrite(mode: FileMode.append);
    await sink.addStream(response.data!.stream);
    await sink.close();
  } catch (e) {
    print('Failed to download chunk $chunkIndex: $e');
  }
}
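The getFilePath() helper used throughout this post isn't shown in the original; a minimal sketch using the path_provider package (my assumption — any writable directory works) might be:

```dart
import 'package:path_provider/path_provider.dart';

/// Hypothetical helper: resolve a bare file name to a writable path.
Future<String> getFilePath(String filename) async {
  final dir = await getApplicationDocumentsDirectory();
  return '${dir.path}/$filename';
}
```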


5. Parallel Downloads for Speed

To speed things up, we can download all the chunks in parallel: each chunk gets its own download task, and we wait for all tasks to finish. (For very large files, you may want to cap concurrency rather than firing every chunk at once.)

List<Future<void>> downloadTasks = [];
for (int i = 0; i < totalChunks; i++) {
  int start = i * chunkSize;
  int end = start + chunkSize - 1;
  if (end >= fileSize) {
    end = fileSize - 1;
  }
  downloadTasks.add(downloadChunk(url, filename, i, start, end));
}

await Future.wait(downloadTasks);        

6. Merging the Chunks Together

Once all the chunks are downloaded, we need to merge them back into a single file. We open a file stream and append the contents of each chunk file, then clean up the temporary files.

Future<void> mergeChunks(String filename, int totalChunks) async {
  String finalFilePath = await getFilePath(filename);
  File finalFile = File(finalFilePath);
  RandomAccessFile raf = await finalFile.open(mode: FileMode.write);

  for (int i = 0; i < totalChunks; i++) {
    String tempFilePath = await getFilePath('$filename.part$i');
    File tempFile = File(tempFilePath);

    if (await tempFile.exists()) {
      List<int> chunkBytes = await tempFile.readAsBytes();
      await raf.writeFrom(chunkBytes);

      // Clean up temporary chunk file after merging
      await tempFile.delete();
    }
  }

  await raf.close();
  print('File merged successfully: $finalFilePath');
}        

7. Putting It All Together

Let’s put all the pieces together and track the total download time:

Future<void> downloadFileInChunksParallel(String url, String filename) async {
  await checkAndRequestPermissions(); // Request storage permission

  int fileSize = await getFileSize(url);
  int totalChunks = (fileSize / chunkSize).ceil(); // Total number of chunks

  Stopwatch stopwatch = Stopwatch(); // Track total download time
  stopwatch.start();

  // Download chunks in parallel
  List<Future<void>> downloadTasks = [];
  for (int i = 0; i < totalChunks; i++) {
    int start = i * chunkSize;
    int end = start + chunkSize - 1;
    if (end >= fileSize) {
      end = fileSize - 1;
    }
    downloadTasks.add(downloadChunk(url, filename, i, start, end));
  }

  await Future.wait(downloadTasks); // Wait for all tasks to complete

  stopwatch.stop();
  print('Total download time: ${stopwatch.elapsed.inSeconds} seconds');

  // Merge chunks
  await mergeChunks(filename, totalChunks);
}        
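Calling the entry point is then a single await (the URL and file name below are placeholders):

```dart
await downloadFileInChunksParallel(
  'https://example.com/files/big-video.mp4', // placeholder URL
  'big-video.mp4',
);
```

If the app is killed mid-download, calling it again re-fetches only the missing byte ranges, since completed chunk files are detected and skipped.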

Why This Matters

  • Seamless Experience: Your users don’t have to worry about re-downloading large files when the network drops. The download resumes from the point where it stopped, saving time and frustration.
  • Bandwidth Efficiency: Only download what’s needed, without wasting extra data.
  • Performance Boost: Parallel downloading significantly speeds up the process.

What Next?

If you’re working on a project that requires file downloading in Flutter, especially with large files, give this approach a try. It’s scalable, efficient, and gives your users a smooth experience.

Let me know your thoughts or share how you handle downloads in your projects!

#flutterdev #flutter #mobiledevelopment #dio #async #dart #networking #resumabledownloads #parallelism

Gervais Youansi

Co-Founder Chat&Yamo & CEO Clean Code Academy | I help junior developers reach senior level quickly.

1mo

Interesting. Do you know what is the difference between Dio and the package http?

Bayard Kevin Ekwa Nde

Développeur .NET | DevOps | CloudOps | AZ-900

1mo

The best. Good job

Samuel Bakon

Dev Full Stack | DevOps | Senior Symfony | Speaker | Training | Odoo | Business | Prompt Engineer | AI tech

1mo

Good job Loïc NGOU. Note that this implementation must be completed in coordination with Ops: without the proper server-side configuration, the work presented here won't deliver the expected benefits.

David Roger Yannick Ngoue

Student in Artificial Intelligence and Big Data

1mo

Thanks for sharing.
