Bandwidth ≠ Latency
Bandwidth and latency are related but different concepts. I will explain them using a road-traffic analogy. As usual, the analogy is not perfect and is meant only to illustrate the concepts.
Bandwidth is the maximum amount of data a network can transmit per unit of time.
Latency is the delay between the moment data is sent (for example, when a user takes an action in a web application) and the moment it arrives at its destination.
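One way to see why both numbers matter is a rough back-of-the-envelope model: the time to deliver a payload is roughly the one-way latency plus the time the data spends on the link. The sketch below uses made-up numbers (50 ms delay, a 100 Mbit/s link, a 1 MB payload) purely for illustration.

```python
# Rough model (illustrative, assumed numbers):
#   delivery_time ≈ latency + payload_size / bandwidth
latency_s = 0.050                 # 50 ms one-way delay
bandwidth_bits_per_s = 100e6      # 100 Mbit/s link
payload_bits = 8e6                # 1 MB payload

delivery_time = latency_s + payload_bits / bandwidth_bits_per_s
print(f"{delivery_time:.3f} s")   # 0.130 s -- more bandwidth shrinks the second term,
                                  # but only lower latency shrinks the first
```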
Look at the picture below. Assume that every second, 10 bicyclists cross the finish line. So the bandwidth is 10 people/second.
However, it takes each rider 1 hour to cover the 12-mile track. So you could say the latency is 60 minutes.
Now look at the picture below. The race track can only accommodate 10 cars, and only one car crosses the finish line every second. So the bandwidth is 1 person/second.
However, these Formula One cars travel at 240 miles per hour and cover the same 12 miles in 3 minutes. So the latency is 3 minutes.
In the examples above, the bicycle race has high bandwidth but poor latency (more time), whereas the Grand Prix has lower bandwidth but excellent latency (less time).
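To make the contrast concrete, here is a small Python sketch that recomputes both scenarios. The speeds are taken or inferred from the figures above: roughly 12 mph for the cyclists (implied by the 1-hour lap over 12 miles) and 240 mph for the cars.

```python
def race_stats(finishers_per_second, track_miles, speed_mph):
    """Return (bandwidth, latency_minutes) for a race-track 'network'."""
    latency_minutes = track_miles / speed_mph * 60   # time for one racer to cross the track
    return finishers_per_second, latency_minutes

# Bicycle race: 10 riders finish per second, ~12 mph average speed
print(race_stats(10, 12, 12))    # (10, 60.0) -> high bandwidth, high latency

# Grand Prix: 1 car finishes per second, 240 mph
print(race_stats(1, 12, 240))    # (1, 3.0)   -> low bandwidth, low latency
```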
NOTE: These analogies are never perfect. Please take them with a pinch of salt; the point of this discussion is simply that bandwidth is not the same as latency.