Iterators & Generators
Hacker’s Infinite Conveyor Belt
Imagine you’re a hacker processing endless server logs. Loading them all into memory at once would crash your system. Instead, you want a conveyor belt that delivers one log at a time, only when needed. That’s the power of iterators and generators: they let you handle massive or infinite sequences efficiently, without overwhelming memory.
Iterators and generators embody lazy evaluation: they produce values only when requested. This chapter shows how to harness them for efficient loops and scalable data handling.
Why Iterators & Generators Matter
- Iterator: An object that implements the __iter__() and __next__() methods, producing elements one at a time.
- Generator: A special function that uses yield to produce values lazily.
- Lazy Evaluation: Values are generated only when needed, saving memory.
- Efficiency: Perfect for large datasets, streams, or infinite sequences.
- Real‑World Analogy: Like a water tap - you don’t store all the water, you get it only when you open the tap.
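To see the memory saving concretely, compare a list against a generator over the same range (a minimal sketch; exact byte counts vary by Python version and platform):

```python
import sys

# A list materializes every element up front.
numbers_list = [n for n in range(1_000_000)]
# A generator object only stores its running state.
numbers_gen = (n for n in range(1_000_000))

print(sys.getsizeof(numbers_list))  # several megabytes
print(sys.getsizeof(numbers_gen))   # a couple hundred bytes
```

Note that sys.getsizeof measures only the container itself, but the point stands: the generator's footprint stays constant no matter how long the sequence is.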
Iterators in Action
numbers = [1, 2, 3]
iterator = iter(numbers)
print(next(iterator)) # 1
print(next(iterator)) # 2
print(next(iterator)) # 3
# print(next(iterator)) # Error: StopIteration
- Why? Iterators deliver one element at a time, raising StopIteration when exhausted.
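Under the hood, a for loop performs exactly this iter()/next() dance and swallows the final StopIteration. A hand-rolled equivalent:

```python
numbers = [1, 2, 3]
iterator = iter(numbers)

while True:
    try:
        item = next(iterator)
    except StopIteration:
        break  # the loop ends silently, just like `for` does
    print(item)  # 1, then 2, then 3
```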
Custom Iterators
class Countdown:
    def __init__(self, start):
        self.start = start
    def __iter__(self):
        return self
    def __next__(self):
        if self.start <= 0:
            raise StopIteration
        value = self.start
        self.start -= 1
        return value

for num in Countdown(5):
    print(num)  # 5, 4, 3, 2, 1
- Why? Custom iterators let you design sequences with specific logic.
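One subtlety: because Countdown returns self from __iter__, it is a one-shot iterator and is exhausted after a single pass. A common refinement (one possible design, sketched here with a generator method) returns a fresh iterator each time, so the object can be looped over repeatedly:

```python
class Countdown:
    """Iterable that can be traversed more than once."""
    def __init__(self, start):
        self.start = start
    def __iter__(self):
        # A generator function yields a new, independent iterator per loop.
        n = self.start
        while n > 0:
            yield n
            n -= 1

c = Countdown(3)
print(list(c))  # [3, 2, 1]
print(list(c))  # [3, 2, 1] again -- not exhausted
```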
Generators – The Lazy Hack
def countdown(n):
    while n > 0:
        yield n
        n -= 1

for num in countdown(5):
    print(num)  # 5, 4, 3, 2, 1
- Why? Generators simplify iterator creation with yield, automatically handling __iter__ and __next__.
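Each yield pauses the function and freezes its local state; the next call to next() resumes it exactly where it left off. You can drive a generator by hand to watch this happen (a quick sketch):

```python
def countdown(n):
    while n > 0:
        yield n
        n -= 1

gen = countdown(3)
print(next(gen))  # 3 -- runs until the first yield
print(next(gen))  # 2 -- resumes where it paused; n was remembered
print(next(gen))  # 1
# One more next(gen) would raise StopIteration, just like any iterator.
```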
Generator Expressions
squares = (x**2 for x in range(5))
for val in squares:
    print(val)
- Why? Generator expressions are compact, memory‑efficient alternatives to list comprehensions.
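Because a generator expression is itself an iterator, you can feed it straight into functions like sum() or max() without ever building a list. The flip side is that it can only be consumed once (a small sketch):

```python
# Sum of squares of 0..4 without an intermediate list.
squares = (x**2 for x in range(5))
print(sum(squares))  # 30
print(sum(squares))  # 0 -- the generator is already exhausted
```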
Real‑World Example
def read_logs(filename):
    with open(filename) as f:
        for line in f:
            yield line.strip()

for log in read_logs("server.log"):
    print(log)
- Why? Generators let you process massive files line by line without loading them entirely into memory.
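Generators also chain into pipelines: each stage pulls one line at a time from the stage before it, so the whole chain stays lazy. A sketch with a hypothetical error filter (the sample log file is created inline so the snippet runs standalone):

```python
import os
import tempfile

def read_logs(filename):
    with open(filename) as f:
        for line in f:
            yield line.strip()

def errors_only(lines):
    # Second pipeline stage: filters lazily, one line at a time.
    return (line for line in lines if "ERROR" in line)

# Create a small sample log so the example is self-contained.
with tempfile.NamedTemporaryFile("w", suffix=".log", delete=False) as f:
    f.write("INFO boot\nERROR disk full\nINFO ok\nERROR timeout\n")
    path = f.name

for log in errors_only(read_logs(path)):
    print(log)  # ERROR disk full, then ERROR timeout

os.remove(path)
```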
The Hacker’s Notebook
- Iterators deliver elements one at a time, using __iter__ and __next__.
- Generators simplify iteration with yield, enabling lazy evaluation.
- Generator expressions provide compact, memory‑efficient loops.
- Lazy evaluation is ideal for large datasets, streams, or infinite sequences.
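For truly infinite sequences, pair a never-ending generator with itertools.islice to take just a slice off the conveyor belt (a minimal sketch):

```python
from itertools import islice

def naturals():
    # Infinite conveyor belt: never terminates on its own.
    n = 1
    while True:
        yield n
        n += 1

# islice lazily takes the first five values, then stops pulling.
print(list(islice(naturals(), 5)))  # [1, 2, 3, 4, 5]
```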
Hacker’s Mindset: treat iterators and generators as your conveyor belts. They deliver data on demand, keeping your systems efficient and scalable.

Updated on Jan 3, 2026