When you first pick up a language that embraces functional ideas, loops can feel odd. In Elixir there are no while or for statements that mutate a counter. Instead, iteration is expressed through recursion and a set of powerful higher‑order functions. This article walks you through the core concepts, shows how they’re applied in everyday code, and equips you with fresh examples you can adapt to your own projects.
Why Understanding Elixir’s Iteration Model Matters
- Predictable data flow: Since values never change in place, reasoning about code becomes easier.
- Composability: Functions that accept other functions can be chained, producing succinct pipelines.
- Scalability: Lazy data structures like Stream let you work with massive inputs (e.g., huge log files) without blowing up memory.
1. The Recursion Primer
Recursion is the foundational looping construct in Elixir. A function calls itself with a simpler argument until a base case is reached.
```elixir
defmodule Inventory do
  # Returns the total quantity of items in a stock list.
  # Example input: [{:apple, 4}, {:banana, 7}, {:orange, 2}]
  def total_quantity([]), do: 0

  def total_quantity([{_product, qty} | tail]) do
    qty + total_quantity(tail)
  end
end
```

Notice how the function head itself is a pattern. The first clause matches an empty list (the base case), while the second clause extracts the head of the list and recurses on the tail.
Tail‑recursion for Efficiency
When the recursive call is the last operation in a function, the compiler can reuse the same stack frame, preventing overflow for very long inputs. The classic pattern is to carry an “accumulator” argument that holds the intermediate result.
```elixir
defmodule Inventory do
  def total_quantity(list), do: total_quantity(list, 0)

  defp total_quantity([], acc), do: acc

  defp total_quantity([{_product, qty} | tail], acc) do
    total_quantity(tail, acc + qty)
  end
end
```

The second private clause now returns the result of the recursive call directly, making the function tail-recursive.
2. Higher‑Order Functions: Let the Library Do the Looping
Writing raw recursion for every collection‑manipulation quickly becomes tedious. The Enum module provides a toolbox of functions that accept other functions as arguments. These are called higher‑order functions because they operate on function values.
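As a tiny sketch of the idea before the specific tools below: a function is an ordinary value that can be bound to a variable and handed to Enum.map/2 like any other argument.

```elixir
# An anonymous function bound to a variable.
double = fn n -> n * 2 end

# Enum.map/2 is higher-order: it receives `double` as an argument.
Enum.map([1, 2, 3], double)
# → [2, 4, 6]

# Named functions are passed with the capture operator.
Enum.map([-1, -2, 3], &abs/1)
# → [1, 2, 3]
```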
2.1 Mapping – Transforming One Collection into Another
Imagine you have a list of usernames and need to turn each into a greeting string.
```elixir
users = ["alice", "bob", "charlie"]

greetings =
  Enum.map(users, fn name -> "Welcome, #{String.capitalize(name)}!" end)
# → ["Welcome, Alice!", "Welcome, Bob!", "Welcome, Charlie!"]
```
Using the capture operator (&) makes the expression tighter:

```elixir
greetings = Enum.map(users, &("Welcome, #{String.capitalize(&1)}!"))
```
2.2 Filtering – Picking Out What You Need
Suppose you only want the orders whose total exceeds a threshold.
```elixir
orders = [
  %{id: 1, total: 45.00},
  %{id: 2, total: 12.99},
  %{id: 3, total: 78.50}
]

big_orders = Enum.filter(orders, fn %{total: t} -> t > 30 end)
# → [%{id: 1, total: 45.0}, %{id: 3, total: 78.5}]
```
2.3 Reducing – Folding an Enumerable into a Single Value
The reduce/3 function (also known as fold or inject) expresses a “walk‑through‑with‑an‑accumulator” pattern. Let’s compute the total revenue from the orders list above.
```elixir
revenue = Enum.reduce(orders, 0.0, fn %{total: t}, acc -> acc + t end)
# → 136.49
```
Because the + operator is itself a function, it can serve directly as the reducer — but only once the elements are plain numbers, so extract the totals first:

```elixir
revenue =
  orders
  |> Enum.map(& &1.total)
  |> Enum.reduce(0.0, &+/2)
# → 136.49
```
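The accumulator need not be a number. As a further sketch (the word list here is made up), reduce/3 can just as well fold a list into a map, tallying occurrences:

```elixir
words = ["apple", "banana", "apple", "cherry", "banana", "apple"]

frequencies =
  Enum.reduce(words, %{}, fn word, acc ->
    # Map.update/4 inserts 1 for a new key, or bumps the existing count.
    Map.update(acc, word, 1, &(&1 + 1))
  end)
# → %{"apple" => 3, "banana" => 2, "cherry" => 1}
```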
2.4 Combining Multiple Operations
Thanks to the pipe operator (|>), you can chain transformations in a readable left‑to‑right flow.
```elixir
high_value_customers =
  orders
  |> Enum.filter(fn %{total: t} -> t > 20 end)
  |> Enum.map(fn %{id: id} -> "Customer #{id}" end)
```
3. Comprehensions – A Concise Alternative to Enum.map
Comprehensions are syntactic sugar for simple maps, filters, and even cartesian products. Their syntax resembles that of list comprehensions in Python or Haskell.
3.1 Simple Mapping
```elixir
squared = for n <- 1..5, do: n * n
# → [1, 4, 9, 16, 25]
```
3.2 Filtering Inside a Comprehension
```elixir
even_squares = for n <- 1..10, rem(n, 2) == 0, do: n * n
# → [4, 16, 36, 64, 100]
```
3.3 Generating a Cartesian Product
Let’s build a small schedule matrix for a conference.
```elixir
times = [:morning, :afternoon]
rooms = [:alpha, :beta, :gamma]

schedule = for time <- times, room <- rooms, do: {time, room}
# → [{:morning, :alpha}, {:morning, :beta}, {:morning, :gamma},
#    {:afternoon, :alpha}, {:afternoon, :beta}, {:afternoon, :gamma}]
```
3.4 Collecting into Structures Other Than Lists
You can tell the comprehension where to store its results via the into: option.
```elixir
price_map =
  for {item, price} <- [{"apple", 1.2}, {"bread", 2.5}, {"milk", 1.8}],
      into: %{} do
    {String.to_atom(item), price}
  end
# → %{apple: 1.2, bread: 2.5, milk: 1.8}
```
4. Streams – Lazy, Composable Pipelines
When you use Enum functions, the whole collection is pulled into memory and processed eagerly. Stream transforms an enumerable into a lazy enumerable, meaning that each step produces values only when they’re asked for.
4.1 Building a Lazy Transformation
```elixir
lazy_doubles = Stream.map(1..100_000, fn n -> n * 2 end)
# No computation happens yet.
```

To force evaluation, pass the stream to an Enum function:

```elixir
first_ten = lazy_doubles |> Enum.take(10)
# → [2, 4, 6, 8, 10, 12, 14, 16, 18, 20]
```
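A quick way to convince yourself of the laziness (a throwaway sketch, not production code) is to add a visible side effect to each step; only the ten demanded elements are ever computed:

```elixir
1..100_000
|> Stream.map(fn n ->
  # This line runs only for elements that are actually demanded.
  IO.puts("computing #{n}")
  n * 2
end)
|> Enum.take(10)
# Prints "computing 1" through "computing 10"; the remaining
# 99,990 elements are never touched.
```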
4.2 One‑Pass Workflows with Streams
Suppose you have a massive CSV file containing product sales. You want to extract the total revenue for rows where the quantity is positive, but you don’t want to load the whole file into memory.
```elixir
defmodule Sales do
  def total_revenue(file_path) do
    File.stream!(file_path)
    |> Stream.map(&String.trim/1)
    # Crude row filter: keep only lines that look like CSV records.
    |> Stream.filter(&String.contains?(&1, ","))
    |> Stream.map(fn line ->
      # Expects exactly three columns: name,quantity,price.
      [_, qty_str, price_str] = String.split(line, ",")
      # String.to_float/1 requires a decimal point (e.g. "12.0", not "12").
      {String.to_integer(qty_str), String.to_float(price_str)}
    end)
    |> Stream.filter(fn {qty, _price} -> qty > 0 end)
    |> Enum.reduce(0.0, fn {qty, price}, acc -> acc + qty * price end)
  end
end
```
All the heavy lifting (reading, parsing, filtering) occurs lazily, and the final Enum.reduce/3 traverses the stream just once, keeping a tiny amount of memory.
4.3 Combining Enumerables and Streams
Sometimes you start with a plain seed value and want to grow a lazy, even infinite, sequence from it. Stream.unfold/2 and Stream.iterate/2 build such streams from a seed. Here's a quick infinite Fibonacci generator, limited to the first 15 numbers:
```elixir
fib_stream = Stream.unfold({0, 1}, fn {a, b} -> {a, {b, a + b}} end)

first_15 = fib_stream |> Enum.take(15)
# → [0, 1, 1, 2, 3, 5, 8, 13, 21, 34, 55, 89, 144, 233, 377]
```
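When each element is simply a function of the previous one, Stream.iterate/2 is an even lighter tool than Stream.unfold/2 — a small sketch:

```elixir
# Powers of two: each element is the previous one doubled.
powers = Stream.iterate(1, fn n -> n * 2 end)

Enum.take(powers, 8)
# → [1, 2, 4, 8, 16, 32, 64, 128]
```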
5. Putting It All Together – A Mini Project
Let’s design a tiny “scoreboard” that reads a text file where each line is "player_name,score". We’ll compute the top three players without loading the whole file into memory.
```elixir
defmodule Scoreboard do
  @doc """
  Returns the three highest-scoring players from a CSV-style file.
  """
  def top_three(path) do
    path
    |> File.stream!()
    |> Stream.map(&String.trim/1)
    |> Stream.filter(&String.contains?(&1, ","))
    |> Stream.map(fn line ->
      [name, score_str] = String.split(line, ",")
      {name, String.to_integer(score_str)}
    end)
    |> Enum.sort_by(fn {_name, score} -> -score end)
    |> Enum.take(3)
  end
end
```
Key takeaways from this snippet:
- File.stream!/1 yields a lazy enumerable of lines.
- Each transformation (trim, filter, parse) is expressed as a Stream step.
- Enum.sort_by/2 is the first eager call: it realizes the stream, but the file is still read only once.
6. Common Patterns and Idioms
- Pattern matching in function heads – Use it to destructure data rather than calling Map.get/2 inside the body.
- Guard clauses – Keep your functions safe: defp add_num(num, sum) when is_number(num), do: sum + num.
- Modular pipelines – Break complex pipelines into private helper functions for readability.
- Collectables – Anything that implements the Collectable protocol can be the target of for … into: or Enum.into/2.
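As a sketch combining the first two idioms (the Shipping module and its pricing rules are invented for illustration), destructure in the function head and guard the types instead of reaching for Map.get/2 in the body:

```elixir
defmodule Shipping do
  # The map is destructured in the head; the guard rejects
  # non-numeric weights before the body ever runs.
  def cost(%{weight: w, express: true}) when is_number(w), do: w * 2.5
  def cost(%{weight: w}) when is_number(w), do: w * 1.0
end

Shipping.cost(%{weight: 4, express: true})
# → 10.0
```

Clause order matters here: the express clause must come first, because the plainer second clause would also match an express order.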
7. Pitfalls to Watch Out For
- Accidentally creating non‑tail‑recursive functions – If you perform work after the recursive call (e.g., concatenating lists), you lose the tail‑call optimization.
- Mixing eager Enum with lazy Stream inside the same pipeline – Once an Enum function is called, the whole upstream stream is realized. Keep the lazy steps together, then finish with a single Enum consumer.
- Using the capture operator incorrectly – &String.length/1 works, but &String.length/2 does not exist; pay attention to arity.
- Assuming order preservation in maps – Elixir maps make no ordering guarantee (small maps happen to iterate in key order; larger ones do not); use Keyword lists or explicit Enum sorting when order matters.
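To make the first pitfall concrete, here's a sketch of two ways to double every element (the Doubler module is invented): the body-recursive clause still has work to do (the ++ concatenation) after each recursive call returns, while the tail-recursive version finishes all its work before recursing and reverses once at the end.

```elixir
defmodule Doubler do
  # Body-recursive: `++` runs *after* the recursive call returns,
  # so stack frames pile up on very long lists.
  def naive([]), do: []
  def naive([h | t]), do: [h * 2] ++ naive(t)

  # Tail-recursive: the recursive call is the last operation;
  # the accumulator is built in reverse and flipped once at the end.
  def tail(list), do: tail(list, [])
  defp tail([], acc), do: Enum.reverse(acc)
  defp tail([h | t], acc), do: tail(t, [h * 2 | acc])
end

Doubler.naive([1, 2, 3])
# → [2, 4, 6]
Doubler.tail([1, 2, 3])
# → [2, 4, 6]
```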
8. Summary – The Core Takeaways
- Pattern matching is the primary way functions receive and dissect data.
- Functions can have multiple clauses; the first matching clause wins.
- Recursion (especially tail‑recursion) is the native looping mechanism.
- Higher‑order functions like Enum.map/2, Enum.filter/2, and Enum.reduce/3 let you express loops without explicit recursion.
- Comprehensions provide a succinct syntax for mapping, filtering, and Cartesian products.
- Streams enable lazy, composable pipelines that process data on demand, saving memory and reducing passes over collections.
- Combining these tools leads to clear, expressive code that scales from tiny lists to huge files.
With these building blocks in your toolbox, you can approach any data‑processing task in Elixir confidently—whether you’re counting inventory items, generating reports from logs, or building real‑time dashboards—all while staying true to the functional spirit of the language.