πŸ’» Computer Science Formulas

Master computer science with these formulas for algorithms, data structures, complexity, and more.

πŸ“Œ Algorithms

Linear Search Time:

\( T(n) = O(n) \)

Binary Search Time:

\( T(n) = O(\log n) \)
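
A minimal Python sketch of the halving step behind this bound (the input list is assumed to be sorted):

```python
def binary_search(arr, target):
    """Return the index of target in sorted list arr, or -1 if absent. O(log n)."""
    lo, hi = 0, len(arr) - 1
    while lo <= hi:
        mid = (lo + hi) // 2        # halve the search range each iteration
        if arr[mid] == target:
            return mid
        elif arr[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return -1
```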

Bubble Sort Time:

\( T(n) = O(n^2) \)

Insertion Sort Time:

\( T(n) = O(n^2) \)

Merge Sort Time:

\( T(n) = O(n \log n) \)

Quick Sort Average Time:

\( T(n) = O(n \log n) \)

Quick Sort Worst Time:

\( T(n) = O(n^2) \)

Euclidean GCD:

\( \gcd(a, b) = \gcd(b, a \mod b) \), \( O(\log \min(a, b)) \)
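
A minimal iterative Python sketch of the Euclidean algorithm:

```python
def gcd(a, b):
    """Euclidean algorithm: repeatedly replace (a, b) with (b, a mod b). O(log min(a, b))."""
    while b:
        a, b = b, a % b
    return a
```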

Exponentiation by Squaring:

\( a^n = (a^{n/2})^2 \) if even, \( a \cdot a^{n-1} \) if odd, \( O(\log n) \)
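
A small Python sketch of the same idea in iterative form, assuming a non-negative integer exponent:

```python
def fast_pow(a, n):
    """Exponentiation by squaring: O(log n) multiplications for integer n >= 0."""
    result = 1
    while n > 0:
        if n & 1:        # odd exponent: multiply one factor of a into the result
            result *= a
        a *= a           # square the base
        n >>= 1          # halve the exponent
    return result
```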

Factorial (Iterative):

\( n! = \prod_{k=1}^{n} k \), \( O(n) \)

Fibonacci (Iterative):

\( F_n = F_{n-1} + F_{n-2} \), \( O(n) \)

πŸ“Œ Data Structures

Array Access Time:

\( O(1) \)

Array Insertion (End):

\( O(1) \) (amortized with dynamic array)

Array Insertion (Middle):

\( O(n) \)

Linked List Access Time:

\( O(n) \)

Linked List Insertion (Known Position):

\( O(1) \)

Stack Push/Pop Time:

\( O(1) \)

Queue Enqueue/Dequeue Time:

\( O(1) \)

Binary Search Tree Search Time:

\( O(h) \) (where \( h \) is height, \( O(\log n) \) if balanced)

Heap Insert Time:

\( O(\log n) \)

Heap Extract-Min/Max Time:

\( O(\log n) \)
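
Both heap bounds can be seen with Python's standard heapq module (a min-heap):

```python
import heapq

heap = []
for x in [5, 1, 4, 2]:
    heapq.heappush(heap, x)   # insert: O(log n) per push
print(heapq.heappop(heap))    # extract-min: O(log n); prints 1
```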

Hash Table Average Case Operations:

\( O(1) \) (assuming a good hash function and a bounded load factor)

πŸ“Œ Complexity Analysis

Big-O Definition:

\( f(n) = O(g(n)) \) if \( \exists c > 0, n_0 \) such that \( f(n) \leq c g(n) \) for all \( n > n_0 \)

Big-Omega Definition:

\( f(n) = \Omega(g(n)) \) if \( \exists c > 0, n_0 \) such that \( f(n) \geq c g(n) \) for all \( n > n_0 \)

Big-Theta Definition:

\( f(n) = \Theta(g(n)) \) if \( f(n) = O(g(n)) \) and \( f(n) = \Omega(g(n)) \)

Master Theorem (Recurrence):

\( T(n) = a T(n/b) + f(n) \); then \( T(n) = O(n^{\log_b a}) \) if \( f(n) = O(n^c) \) with \( c < \log_b a \) (Case 1; the other cases cover \( f(n) \) matching or dominating \( n^{\log_b a} \))
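
For example, applying Case 1 to \( T(n) = 4 T(n/2) + O(n) \): here \( a = 4 \), \( b = 2 \), \( \log_b a = 2 \), and \( f(n) = O(n^1) \) with \( 1 < 2 \), so \( T(n) = O(n^2) \).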

Space Complexity (Array):

\( O(n) \)

Space Complexity (Recursion):

\( O(d) \) for recursion depth \( d \); e.g., \( O(\log n) \) for balanced divide-and-conquer recursion

Amortized Analysis (Dynamic Array):

\( O(1) \) per insertion over \( n \) operations

Time Complexity (Nested Loops):

\( O(n^k) \) for \( k \) nested loops, each running \( n \) iterations

Comparison-Based Sorting Lower Bound:

\( \Omega(n \log n) \)

Number of Comparisons (Binary Search):

\( \lceil \log_2 (n + 1) \rceil \)

Recurrence for Divide-and-Conquer:

\( T(n) = 2 T(n/2) + O(n) = O(n \log n) \)

πŸ“Œ Boolean Logic

AND Operation:

\( A \land B \)

OR Operation:

\( A \lor B \)

NOT Operation:

\( \neg A \)

XOR Operation:

\( A \oplus B = (A \land \neg B) \lor (\neg A \land B) \)

Identity Laws:

\( A \land 1 = A \), \( A \lor 0 = A \)

Complement Laws:

\( A \land \neg A = 0 \), \( A \lor \neg A = 1 \)

De Morgan’s Laws:

\( \neg (A \land B) = \neg A \lor \neg B \)

\( \neg (A \lor B) = \neg A \land \neg B \)

Distributive Laws:

\( A \land (B \lor C) = (A \land B) \lor (A \land C) \)

\( A \lor (B \land C) = (A \lor B) \land (A \lor C) \)

Truth Table Size:

\( 2^n \) rows for \( n \) variables
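
A small Python sketch that enumerates all \( 2^n \) rows for \( n = 2 \) variables and checks De Morgan's first law on each:

```python
from itertools import product

# Enumerate all 2**2 truth-table rows and verify not (A and B) == (not A) or (not B).
for A, B in product([False, True], repeat=2):
    assert (not (A and B)) == ((not A) or (not B))
print("De Morgan's first law holds on all 4 rows")
```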

πŸ“Œ Recursion

Factorial (Recursive):

\( n! = n \cdot (n-1)! \), \( 0! = 1 \), \( O(n) \)

Fibonacci (Recursive):

\( F_n = F_{n-1} + F_{n-2} \), \( O(2^n) \) without memoization

Tower of Hanoi:

\( T(n) = 2 T(n-1) + 1 \), \( T(n) = 2^n - 1 \)
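
A minimal recursive Python sketch; the peg names are illustrative:

```python
def hanoi(n, source, target, spare):
    """Move n disks from source to target, printing each move (2**n - 1 moves total)."""
    if n == 0:
        return
    hanoi(n - 1, source, spare, target)        # move n-1 disks out of the way
    print(f"move disk {n}: {source} -> {target}")
    hanoi(n - 1, spare, target, source)        # move them onto the target

hanoi(3, "A", "C", "B")                        # prints 2**3 - 1 = 7 moves
```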

Recursive Binary Search:

\( T(n) = T(n/2) + O(1) = O(\log n) \)

Memoized Fibonacci Time:

\( O(n) \) with dynamic programming
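
One way to get the \( O(n) \) bound in Python is memoization via functools.lru_cache (a sketch, not the only approach):

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def fib(n):
    """Memoized Fibonacci: each value is computed once, so O(n) time overall."""
    return n if n < 2 else fib(n - 1) + fib(n - 2)

print(fib(50))   # 12586269025
```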

Space Complexity (Tail Recursion):

\( O(1) \) (when the compiler or runtime performs tail-call elimination)

Depth of Recursion:

\( d = \log_2 n \) (for balanced binary recursion)

Ackermann Function:

\( A(m, n) = n + 1 \) if \( m = 0 \), \( A(m-1, 1) \) if \( n = 0 \), else \( A(m-1, A(m, n-1)) \)

Combination (Recursive):

\( C(n, k) = C(n-1, k-1) + C(n-1, k) \), \( C(n, 0) = C(n, n) = 1 \), \( O(2^n) \) without memoization

Permutation (Recursive):

\( P(n, k) = n \cdot P(n-1, k-1) \), \( P(n, 0) = 1 \), \( O(k) \)

πŸ“Œ Graph Algorithms

Number of Edges (Complete Graph):

\( |E| = \frac{n (n-1)}{2} \) for \( K_n \)

DFS Time Complexity:

\( O(V + E) \)

BFS Time Complexity:

\( O(V + E) \)
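
A minimal Python sketch of BFS on an adjacency-list dictionary (the graph representation is an assumption):

```python
from collections import deque

def bfs(graph, start):
    """Breadth-first traversal order from start; each vertex and edge is visited once: O(V + E)."""
    visited = {start}
    order = []
    queue = deque([start])
    while queue:
        u = queue.popleft()
        order.append(u)
        for v in graph[u]:
            if v not in visited:
                visited.add(v)
                queue.append(v)
    return order

print(bfs({"A": ["B", "C"], "B": ["D"], "C": [], "D": []}, "A"))   # ['A', 'B', 'C', 'D']
```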

Dijkstra’s Algorithm Time:

\( O((V + E) \log V) \) with binary heap
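
A lazy-deletion Python sketch with heapq that matches the binary-heap bound; the graph maps each vertex to a list of (neighbor, weight) pairs, and non-negative weights are assumed:

```python
import heapq

def dijkstra(graph, source):
    """Shortest distances from source over non-negative edge weights: O((V + E) log V)."""
    dist = {source: 0}
    heap = [(0, source)]
    while heap:
        d, u = heapq.heappop(heap)
        if d > dist.get(u, float("inf")):
            continue                              # stale heap entry, skip it
        for v, w in graph[u]:
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                heapq.heappush(heap, (nd, v))
    return dist
```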

Bellman-Ford Time:

\( O(V E) \)

Floyd-Warshall Time:

\( O(V^3) \)

Prim’s Algorithm Time:

\( O(E \log V) \) with binary heap

Kruskal’s Algorithm Time:

\( O(E \log E) \)

Topological Sort Time:

\( O(V + E) \)

Max Flow (Ford-Fulkerson):

\( O(E f) \) (where \( f \) is max flow value)

Euler Path Condition:

The graph is connected (ignoring isolated vertices) and has exactly 0 or 2 vertices of odd degree

πŸ“Œ Probability in Computing

Probability of Event:

\( P(A) = \frac{|A|}{|S|} \) (uniform sample space)

Expected Value:

\( E(X) = \sum x_i P(X = x_i) \)

Variance:

\( Var(X) = E(X^2) - [E(X)]^2 \)

Binomial Probability:

\( P(X = k) = \binom{n}{k} p^k (1-p)^{n-k} \)

Binomial Expected Value:

\( E(X) = n p \)

Geometric Probability:

\( P(X = k) = (1-p)^{k-1} p \)

Geometric Expected Value:

\( E(X) = \frac{1}{p} \)

Hash Collision Probability (Birthday):

\( P(\text{collision}) \approx 1 - e^{-\frac{k^2}{2n}} \) (where \( k \) is items, \( n \) is slots)
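
A quick Python check of this approximation against the classic birthday setting (\( k = 23 \) people, \( n = 365 \) days):

```python
import math

def collision_probability(k, n):
    """Approximate probability that k items hashed uniformly into n slots collide."""
    return 1 - math.exp(-k * k / (2 * n))

print(collision_probability(23, 365))   # ~0.52 (the exact value is ~0.507)
```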

Load Factor (Hash Table):

\( \alpha = \frac{n}{m} \) (items/slots)

Markov Chain Transition:

\( P(X_{t+1} = j | X_t = i) = p_{ij} \)

Steady-State Probability:

\( \pi P = \pi \), \( \sum_i \pi_i = 1 \) (where \( \pi \) is the stationary row vector and \( P \) the transition matrix)

πŸ“Œ Cryptography

Modular Exponentiation:

\( c = m^e \mod n \), \( O(\log e) \) with fast exponentiation

RSA Encryption:

\( c = m^e \mod n \) (public key \( (e, n) \))

RSA Decryption:

\( m = c^d \mod n \) (private key \( d \))

RSA Key Condition:

\( e \cdot d \equiv 1 \pmod{\phi(n)} \), \( \phi(n) = (p-1)(q-1) \)
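
A toy end-to-end Python sketch with textbook-small primes (insecure, for illustration only; pow(e, -1, phi) needs Python 3.8+):

```python
# Toy RSA: p = 61, q = 53 are far too small for real use.
p, q = 61, 53
n = p * q                      # 3233
phi = (p - 1) * (q - 1)        # 3120
e = 17                         # public exponent, coprime to phi
d = pow(e, -1, phi)            # 2753; satisfies e*d ≑ 1 (mod phi(n))

m = 65                         # message, must be < n
c = pow(m, e, n)               # encryption: c = m^e mod n
assert pow(c, d, n) == m       # decryption: m = c^d mod n
```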

Fermat’s Little Theorem:

\( a^{p-1} \equiv 1 \pmod{p} \) (prime \( p \), \( p \nmid a \))

Euler’s Theorem:

\( a^{\phi(n)} \equiv 1 \pmod{n} \) (\( \gcd(a, n) = 1 \))

Chinese Remainder Theorem:

\( x \equiv a_i \pmod{m_i} \) has unique solution \( \pmod{\prod m_i} \) if \( m_i \) coprime

Diffie-Hellman Key Exchange:

\( K = g^{ab} \mod p \) (shared secret)
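
A toy Python sketch with a small prime (insecure; the private exponents are arbitrary picks):

```python
p, g = 23, 5              # public prime modulus and generator
a, b = 6, 15              # private exponents chosen by the two parties
A = pow(g, a, p)          # one side publishes g^a mod p
B = pow(g, b, p)          # the other publishes g^b mod p
assert pow(B, a, p) == pow(A, b, p)   # both sides derive K = g^(ab) mod p
```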

Hash Function Output Size:

\( 2^k \) possible outputs for \( k \)-bit hash

Birthday Attack Probability:

\( P(\text{collision}) \approx 1 - e^{-\frac{n^2}{2 \cdot 2^k}} \) (for \( n \) inputs)

πŸ“Œ Information Theory

Shannon Entropy:

\( H(X) = -\sum p(x_i) \log_2 p(x_i) \)

Binary Entropy:

\( H(p) = -p \log_2 p - (1-p) \log_2 (1-p) \)
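
A small Python sketch covering both entropy formulas above for any finite distribution:

```python
import math

def entropy(probs):
    """Shannon entropy in bits: H = -sum p_i * log2(p_i), skipping zero-probability terms."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(entropy([0.5, 0.5]))   # fair coin: 1.0 bit
print(entropy([0.9, 0.1]))   # biased coin: ~0.469 bits
```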

Mutual Information:

\( I(X; Y) = H(X) + H(Y) - H(X, Y) \)

Channel Capacity (Binary Symmetric):

\( C = 1 - H(p) \) (where \( p \) is error probability)

Kullback-Leibler Divergence:

\( D_{KL}(P || Q) = \sum p(x) \log_2 \frac{p(x)}{q(x)} \)

Information Content:

\( I(x) = -\log_2 p(x) \)

Compression Ratio:

\( CR = \frac{\text{Original Size}}{\text{Compressed Size}} \)

Huffman Coding Average Length:

\( L = \sum p_i l_i \) (where \( l_i \) is code length)

Entropy Rate:

\( h = \lim_{n \to \infty} \frac{H(X_1, X_2, \ldots, X_n)}{n} \)

Bit Rate:

\( R = f_s \cdot b \) (samples per second \( f_s \), bits per sample \( b \))
