Time and Space Complexity
Ahmad Uzair
AI/ML Engineer | Generative AI & Computer Vision Enthusiast | Deep Learning, NLP, and LLMs | Python, TensorFlow, PyTorch | Educator & Researcher
We can solve a problem with many different approaches, and each approach can be expressed as an algorithm. How do we know which algorithm is better? We can compare them on several metrics, but one of the most important is time: how much work does your algorithm need to do to solve the problem as the input grows?
Definition: Time complexity measures the amount of time an algorithm takes to complete based on the size of the input.
Example: Imagine you're sorting a deck of cards. If you're using a slow sorting method, like looking for the smallest card repeatedly and putting it in the right place, it will take more time as the number of cards increases.
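As a rough sketch (using a plain Python list to stand in for the deck, with an illustrative function name), the "look for the smallest card repeatedly" method looks like this:

```python
# A minimal selection-sort sketch: repeatedly find the smallest remaining
# "card" and move it to the front of the unsorted part.
def selection_sort(cards):
    cards = list(cards)                        # work on a copy
    for i in range(len(cards)):
        smallest = i
        for j in range(i + 1, len(cards)):     # scan the unsorted part
            if cards[j] < cards[smallest]:
                smallest = j
        cards[i], cards[smallest] = cards[smallest], cards[i]
    return cards

print(selection_sort([7, 3, 9, 1, 5]))  # [1, 3, 5, 7, 9]
```

The nested loops perform roughly n × n comparisons, so doubling the number of cards roughly quadruples the work.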
Common time complexities (each is illustrated in the short Python sketch after this list):
O(1): Constant time complexity. The algorithm's execution time remains constant regardless of the input size; the number of operations is fixed and known in advance.
O(n): Linear time complexity. The execution time grows linearly with the input size, as when we iterate over n elements.
O(log n): Logarithmic time complexity. Common in algorithms with divide-and-conquer strategies like binary search.
O(n^2): Quadratic time complexity. Common in nested loops where the algorithm processes each pair of elements.
O(2^n): Exponential time complexity. Often associated with recursive solutions that generate an exponential number of subproblems.
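Here is a quick Python sketch of each class. The function names are just illustrative, not from any particular library:

```python
def get_first(items):
    # O(1): one operation, regardless of how many items there are.
    return items[0]

def total(items):
    # O(n): one pass over all n elements.
    s = 0
    for x in items:
        s += x
    return s

def binary_search(sorted_items, target):
    # O(log n): the search range is halved on every step.
    lo, hi = 0, len(sorted_items) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if sorted_items[mid] == target:
            return mid
        elif sorted_items[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return -1

def has_duplicate_pair(items):
    # O(n^2): nested loops compare every pair of elements.
    for i in range(len(items)):
        for j in range(i + 1, len(items)):
            if items[i] == items[j]:
                return True
    return False

def fib(n):
    # O(2^n): naive recursion spawns two subproblems per call.
    if n < 2:
        return n
    return fib(n - 1) + fib(n - 2)
```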
Space Complexity: Space complexity measures the amount of memory an algorithm uses in relation to the size of the input. It helps us understand how the memory requirements of an algorithm grow with an increase in the input size. Similar to time complexity, space complexity is expressed using big O notation.
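As a small sketch of the usual time/space trade-off (the function names are again just for illustration), here is the same duplicate-checking problem solved with O(1) extra space and with O(n) extra space:

```python
def has_duplicate_constant_space(items):
    # O(1) extra space: only a couple of loop variables,
    # but O(n^2) time because every pair is compared.
    for i in range(len(items)):
        for j in range(i + 1, len(items)):
            if items[i] == items[j]:
                return True
    return False

def has_duplicate_linear_space(items):
    # O(n) extra space: the set may grow to hold all n elements,
    # but the time drops to O(n).
    seen = set()
    for x in items:
        if x in seen:
            return True
        seen.add(x)
    return False
```

Spending more memory often buys faster running time, which is why time and space complexity are usually analyzed together.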
Common space complexities: