João Paulo Abreu
Learning Elixir: Function Composition

Function composition is like building with LEGO blocks - you create small, simple pieces that can be connected together to build something bigger. In Elixir, instead of writing one huge function that does everything, we write small functions that do one thing well, then combine them. This makes our code easier to understand, test, and reuse. In this article, we'll start with simple examples and gradually explore different ways to combine functions in Elixir.

Note: The examples in this article use Elixir 1.18.3. While most operations should work across different versions, some functionality might vary.


Introduction

Function composition is about building complex functionality from simple, composable parts. Rather than writing large functions that do many things, we write small functions that do one thing well and combine them to achieve our goals.

If you're familiar with the pipe operator (|>), function composition is a related concept but goes a step further - instead of just chaining function calls, we create entirely new functions by combining existing ones.
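
The difference is easy to see with two tiny anonymous functions (a minimal sketch; the names are illustrative):

```elixir
double = fn x -> x * 2 end
increment = fn x -> x + 1 end

# Piping transforms a value immediately:
5 |> double.() |> increment.()  # 11

# Composition builds a new, reusable function to call later:
double_then_increment = fn x -> x |> double.() |> increment.() end
double_then_increment.(5)       # 11
```

Both lines produce 11, but only composition gives you a named function you can pass around, store, or reuse.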

Understanding Function Composition

Why Use Function Composition?

Before we dive into the "how", let's understand the "why". Imagine you're making a sandwich:

  1. You could write one big function that does everything (get bread, add mayo, add lettuce, add tomato, etc.)
  2. Or you could write small functions for each step and combine them

The second approach is function composition, and it's better because:

  • Each function is simple and easy to test
  • You can reuse functions (maybe use "add mayo" in other recipes)
  • It's easier to understand what each part does
  • You can easily change or rearrange steps

What is Function Composition?

Think of function composition like a factory assembly line: one function takes an input and produces an output, which becomes the input for the next function. For example, if we have a function that doubles a number and another that adds 1, we can combine them to create a new function that does both operations in sequence.

In Elixir, we can achieve this in several ways:

defmodule CompositionBasics do
  def double(x), do: x * 2
  def increment(x), do: x + 1

  # Manual composition
  def double_then_increment(x) do
    increment(double(x))
  end

  # Using the pipe operator
  def double_then_increment_piped(x) do
    x |> double() |> increment()
  end

  # Creating a new function by combining two existing functions
  # This might look complex at first, but it's just saying:
  # "make a new function that first applies g, then applies f"
  # The dot syntax, f.(x), is how anonymous functions are called
  def compose(f, g) do
    fn x -> f.(g.(x)) end
  end
end

Testing in IEx:

iex> CompositionBasics.double_then_increment(5)
11

iex> CompositionBasics.double_then_increment_piped(5)
11

iex> # Create a new function by composing increment and double
iex> composed = CompositionBasics.compose(&CompositionBasics.increment/1, &CompositionBasics.double/1)
#Function<...>

iex> # Now we can use our composed function
iex> composed.(5)  # Same as: increment(double(5)) = increment(10) = 11
11

Basic Composition Techniques

Using Anonymous Functions

Let's start with the simplest form of composition. Anonymous functions (functions without names) are perfect for creating small, one-time-use functions:

defmodule SimpleComposition do
  def add(x, y), do: x + y
  def multiply(x, y), do: x * y

  # Create a function that combines two operations:
  # 1. First add 10
  # 2. Then multiply by 2
  # So compose_math_ops().(5) would be: (5 + 10) * 2 = 30
  def compose_math_ops do
    fn x ->
      x
      |> add(10)
      |> multiply(2)
    end
  end

  # Create a pipeline that applies a list of operations in order
  # Each operation is applied to the result of the previous one
  def create_pipeline(operations) do
    fn initial_value ->
      # Start with initial_value, apply each operation in sequence
      Enum.reduce(operations, initial_value, fn operation, acc ->
        operation.(acc)
      end)
    end
  end
end

Testing in IEx:

iex> math_pipeline = SimpleComposition.compose_math_ops()
#Function<...>

iex> math_pipeline.(5)
30

iex> operations = [
       &(&1 + 5),
       &(&1 * 2),
       &(&1 - 3)
]
[#Function<...>, #Function<...>, #Function<...>]

iex> pipeline = SimpleComposition.create_pipeline(operations)
#Function<...>

iex> pipeline.(10)
27

Function Capture and Composition

The capture operator (&) is Elixir's shorthand for creating anonymous functions. It might look strange at first, but it's just a concise way to reference functions:

defmodule CaptureComposition do
  def process_string(string) do
    transformations = [
      &String.trim/1,
      &String.downcase/1,
      &String.replace(&1, " ", "_"),  # &1 refers to the first argument
      &("prefix_" <> &1)                 # <> is the string concatenation operator
    ]

    Enum.reduce(transformations, string, fn f, acc -> f.(acc) end)
  end

  # Compose validators, stopping at the first error
  def compose_validators(validators) do
    fn value ->
      # reduce_while stops iteration when it receives {:halt, value}
      Enum.reduce_while(validators, {:ok, value}, fn validator, {:ok, val} ->
        case validator.(val) do
          {:ok, _} = result -> {:cont, result}
          {:error, _} = error -> {:halt, error}
        end
      end)
    end
  end
end

Testing in IEx:

iex> CaptureComposition.process_string("  Hello World  ")
"prefix_hello_world"

iex> validators = [
  fn x -> if is_binary(x), do: {:ok, x}, else: {:error, "not a string"} end,
  fn x -> if String.length(x) > 2, do: {:ok, x}, else: {:error, "too short"} end,
  fn x -> if String.match?(x, ~r/^[a-z]+$/), do: {:ok, x}, else: {:error, "invalid chars"} end
]

iex> validator = CaptureComposition.compose_validators(validators)
#Function<...>

iex> validator.("hello")
{:ok, "hello"}

iex> validator.("hi")
{:error, "too short"}

iex> validator.("Hello")
{:error, "invalid chars"}

Higher-Order Functions

A higher-order function is simply a function that either takes other functions as arguments or returns a function. Don't let the name intimidate you - you've probably already used them with Enum.map/2.

defmodule HigherOrder do
  # Function that returns a multiplier function
  def multiplier(factor) do
    fn x -> x * factor end
  end

  # Function that returns a filter function
  def create_filter(predicate) do
    fn list -> Enum.filter(list, predicate) end
  end

  # Function that composes two unary functions
  def compose(f, g) do
    fn x -> f.(g.(x)) end
  end

  # Composes multiple functions in a right-to-left sequence
  # This follows the standard mathematical definition: (f ∘ g ∘ h)(x) = f(g(h(x)))
  # For a list like [f, g, h], the rightmost function (h) is applied first
  # We reverse the list so Enum.reduce applies functions in correct order: h, then g, then f
  def compose_all(functions) do
    fn x ->
      functions
      |> Enum.reverse()
      |> Enum.reduce(x, fn f, acc -> f.(acc) end)
    end
  end

  # Composition for functions that return {:ok, value} or {:error, reason}
  def compose_result(f, g) do
    fn x ->
      case g.(x) do
        {:ok, result} -> f.(result)
        error -> error
      end
    end
  end
end

Testing in IEx:

iex> times_three = HigherOrder.multiplier(3)
#Function<...>

iex> times_three.(7)
21

iex> positive_filter = HigherOrder.create_filter(&(&1 > 0))
#Function<...>

iex> positive_filter.([-2, -1, 0, 1, 2, 3])
[1, 2, 3]

iex> add_one = &(&1 + 1)
#Function<...>

iex> double = &(&1 * 2)
#Function<...>

iex> composed = HigherOrder.compose(add_one, double)
#Function<...>

iex> composed.(5)
11

# Note: this is right-to-left composition: add_one(double(sub_3(10))) = add_one(double(7)) = add_one(14) = 15
iex> pipeline = HigherOrder.compose_all([add_one, double, &(&1 - 3)])
#Function<...>

iex> pipeline.(10)
15
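
The compose_result/2 function above isn't exercised in the IEx session. Here is a standalone sketch of the same idea; the parse and double_ok functions are illustrative, not part of the original module:

```elixir
# A standalone version of compose_result/2: run g, and only run f on success
compose_result = fn f, g ->
  fn x ->
    case g.(x) do
      {:ok, result} -> f.(result)
      error -> error
    end
  end
end

parse = fn string ->
  case Integer.parse(string) do
    {n, ""} -> {:ok, n}
    _ -> {:error, "not an integer"}
  end
end

double_ok = fn n -> {:ok, n * 2} end

parse_then_double = compose_result.(double_ok, parse)
parse_then_double.("21")   # {:ok, 42}
parse_then_double.("abc")  # {:error, "not an integer"}
```

The error tuple flows straight through without ever reaching double_ok, which is exactly the short-circuiting behavior we want when composing fallible steps.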

Composition Order: Pipeline vs. Mathematical

You might have noticed that SimpleComposition.create_pipeline and HigherOrder.compose_all compose a list of functions in different orders:

  • create_pipeline([f, g, h]) creates a left-to-right pipeline: h(g(f(x))). This is intuitive and reads like the pipe operator.
  • compose_all([f, g, h]) creates a right-to-left composition: f(g(h(x))). This follows the traditional mathematical definition of function composition.

Both patterns are useful. The key is to be consistent and understand which order your composition function uses.
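A quick way to see the difference is to run both orders over the same list of functions. These are standalone re-implementations of the two helpers, for illustration only:

```elixir
add_one = fn x -> x + 1 end
double = fn x -> x * 2 end

# Left-to-right, like create_pipeline/1:
pipeline = fn functions ->
  fn x -> Enum.reduce(functions, x, fn f, acc -> f.(acc) end) end
end

# Right-to-left, like compose_all/1:
compose_all = fn functions ->
  fn x ->
    functions |> Enum.reverse() |> Enum.reduce(x, fn f, acc -> f.(acc) end)
  end
end

pipeline.([add_one, double]).(5)     # (5 + 1) * 2 = 12
compose_all.([add_one, double]).(5)  # (5 * 2) + 1 = 11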

Partial Application and Currying

Sometimes we want to "pre-fill" some arguments of a function and create a new function with fewer arguments. This is called partial application. Currying is a related concept where a function that takes multiple arguments is transformed into a series of functions that each take one argument:

defmodule PartialApplication do
  # Manual currying
  def curry_add do
    # Returns a function that returns another function
    fn a -> fn b -> a + b end end
  end

  # Generic curry for 2-arity functions
  # Transforms a function that takes 2 arguments into nested functions
  def curry2(fun) do
    fn a -> fn b -> fun.(a, b) end end
  end

  # Partial application using closures
  def partial(fun, arg1) do
    fn arg2 -> fun.(arg1, arg2) end
  end

  # More practical example
  def configure_logger(level) do
    fn module, message ->
      IO.puts("[#{level}] #{module}: #{message}")
    end
  end

  # Create a validator that checks data against a set of rules
  # This is a practical example of creating customized functions
  def create_validator(rules) do
    fn data ->
      # Check each field against its rule
      # Pattern match on function arguments directly in the fn
      Enum.reduce(rules, {:ok, data}, fn
        {field, rule}, {:ok, data} ->
          case Map.get(data, field) do
            nil -> {:error, "#{field} is required"}
            value ->
              if rule.(value), do: {:ok, data}, else: {:error, "#{field} is invalid"}
          end
        _, error -> error  # If there's already an error, pass it through
      end)
    end
  end
end

Testing in IEx:

iex> curried_add = PartialApplication.curry_add()
#Function<...>

iex> add_five = curried_add.(5)
#Function<...>

iex> add_five.(3)
8

iex> info_logger = PartialApplication.configure_logger("INFO")
#Function<...>

iex> info_logger.("UserModule", "User logged in")
[INFO] UserModule: User logged in
:ok

iex> rules = %{
  name: &(String.length(&1) > 2),
  age: &(&1 >= 18)
}

iex> validator = PartialApplication.create_validator(rules)
#Function<...>

iex> validator.(%{name: "Alice", age: 25})
{:ok, %{age: 25, name: "Alice"}}

iex> validator.(%{name: "Bo", age: 20})
{:error, "name is invalid"}
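
The curry2/1 and partial/2 helpers from the module above don't appear in the IEx session. Here is a standalone sketch using equivalent anonymous functions (the multiplication and subtraction examples are illustrative):

```elixir
# Standalone equivalents of curry2/1 and partial/2 from the module above
curry2 = fn fun -> fn a -> fn b -> fun.(a, b) end end end
partial = fn fun, arg1 -> fn arg2 -> fun.(arg1, arg2) end end

curried_mul = curry2.(fn a, b -> a * b end)
curried_mul.(3).(4)    # 12

subtract_from_ten = partial.(fn a, b -> a - b end, 10)
subtract_from_ten.(3)  # 7
```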

Working with Results That Can Fail

In Elixir, it's common to use {:ok, value} and {:error, reason} tuples to represent operations that might fail. Here's how to compose functions that work with these patterns:

defmodule MonadicComposition do
  # Compose functions that return {:ok, value} or {:error, reason}
  # This pattern is common in Elixir for error handling

  # bind: if we have a successful result, apply the next function
  # if we have an error, skip the function and pass the error along
  def bind(result, fun) do
    case result do
      {:ok, value} -> fun.(value)
      error -> error
    end
  end

  # Compose multiple monadic functions
  def pipeline(initial, functions) do
    # Apply each function to the result, stopping on first error
    Enum.reduce(functions, {:ok, initial}, fn fun, acc ->
      bind(acc, fun)
    end)
  end

  # Example functions that might fail
  def parse_int(string) do
    case Integer.parse(string) do
      {int, ""} -> {:ok, int}  # Empty string means all input was parsed
      _ -> {:error, "Invalid integer"}
    end
  end

  def divide_by(number, divisor) do
    if divisor == 0 do
      {:error, "Division by zero"}
    else
      {:ok, number / divisor}
    end
  end

  def ensure_positive(number) do
    if number > 0 do
      {:ok, number}
    else
      {:error, "Number must be positive"}
    end
  end
end

Testing in IEx:

iex> # Using the MonadicComposition module functions

iex> {:ok, 10} |> MonadicComposition.bind(&MonadicComposition.divide_by(&1, 2)) |> MonadicComposition.bind(&MonadicComposition.ensure_positive/1)
{:ok, 5.0}

iex> {:ok, 10} |> MonadicComposition.bind(&MonadicComposition.divide_by(&1, 0)) |> MonadicComposition.bind(&MonadicComposition.ensure_positive/1)
{:error, "Division by zero"}

iex> MonadicComposition.pipeline("42", [
  &MonadicComposition.parse_int/1,
  &MonadicComposition.divide_by(&1, 2),
  &MonadicComposition.ensure_positive/1
])
{:ok, 21.0}

iex> MonadicComposition.pipeline("-10", [
  &MonadicComposition.parse_int/1,
  &MonadicComposition.ensure_positive/1
])
{:error, "Number must be positive"}

Real-World Examples

Data Processing Pipeline

defmodule DataPipeline do
  # Individual processing steps
  def normalize_text(text) do
    text
    |> String.trim()
    |> String.downcase()
  end

  def extract_words(text) do
    # ~r creates a regular expression, \W+ matches non-word characters
    String.split(text, ~r/\W+/, trim: true)
  end

  def filter_stopwords(words) do
    # ~w creates a list of words (strings)
    stopwords = ~w(the a an and or but in on at to for)
    Enum.reject(words, &(&1 in stopwords))
  end

  def count_frequencies(words) do
    Enum.frequencies(words)
  end

  # Compose a text analysis pipeline
  def analyze_text do
    fn text ->
      text
      |> normalize_text()
      |> extract_words()
      |> filter_stopwords()
      |> count_frequencies()
    end
  end

  # More complex: compose with configuration
  def create_analyzer(options \\ []) do
    # Keyword.get retrieves values from keyword lists with defaults
    min_length = Keyword.get(options, :min_word_length, 3)
    top_n = Keyword.get(options, :top_n, 10)

    fn text ->
      text
      |> normalize_text()
      |> extract_words()
      |> filter_stopwords()
      |> Enum.filter(&(String.length(&1) >= min_length))
      |> count_frequencies()
      # _ ignores the word, we only care about count for sorting
      |> Enum.sort_by(fn {_, count} -> count end, :desc)
      |> Enum.take(top_n)
    end
  end
end

Testing in IEx:

iex> text = "The quick brown fox jumps over the lazy dog. The dog was very lazy."

iex> analyzer = DataPipeline.analyze_text()
#Function<...>

iex> analyzer.(text)
%{
  "brown" => 1,
  "dog" => 2,
  "fox" => 1,
  "jumps" => 1,
  "lazy" => 2,
  "over" => 1,
  "quick" => 1,
  "very" => 1,
  "was" => 1
}

iex> custom_analyzer = DataPipeline.create_analyzer(min_word_length: 4, top_n: 5)
#Function<...>

iex> custom_analyzer.(text)
[
  {"lazy", 2},
  {"brown", 1},
  {"jumps", 1},
  {"over", 1},
  {"quick", 1}
]

Middleware Composition

defmodule Middleware do
  # Middleware pattern: each function transforms the conn (connection)
  def compose_middleware(middlewares) do
    fn conn ->
      Enum.reduce(middlewares, conn, fn middleware, acc ->
        middleware.(acc)
      end)
    end
  end

  # Example middlewares
  def logging_middleware(conn) do
    IO.puts("Request: #{conn.method} #{conn.path}")
    conn
  end

  def auth_middleware(conn) do
    if conn.headers["authorization"] do
      Map.put(conn, :authenticated, true)
    else
      conn
      |> Map.put(:status, 401)
      |> Map.put(:halted, true)
    end
  end

  def timing_middleware(conn) do
    # monotonic_time gives a timestamp for measuring elapsed time
    start_time = System.monotonic_time(:millisecond)
    Map.put(conn, :timing_start, start_time)
  end

  # Create a request pipeline
  def create_pipeline do
    compose_middleware([
      &logging_middleware/1,  # & captures a function reference, /1 means arity 1
      &timing_middleware/1,
      &auth_middleware/1
    ])
  end
end

Testing in IEx:

iex> conn = %{method: "GET", path: "/users", headers: %{}, status: 200, halted: false}

iex> pipeline = Middleware.create_pipeline()
#Function<...>

iex> pipeline.(conn)
Request: GET /users
%{
  halted: true,
  headers: %{},
  method: "GET",
  path: "/users",
  status: 401,
  timing_start: -576460751779  # Actual value will vary
}

iex> # The | operator updates specific fields in a map, keeping other fields unchanged
iex> auth_conn = %{conn | headers: %{"authorization" => "Bearer token"}}

iex> pipeline.(auth_conn)
Request: GET /users
%{
  authenticated: true,
  halted: false,
  headers: %{"authorization" => "Bearer token"},
  method: "GET",
  path: "/users",
  status: 200,
  timing_start: -576460751777  # Actual value will vary
}
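
Notice that compose_middleware/1 keeps running every middleware even after auth_middleware sets :halted - in the first run above, logging and timing fire before auth rejects the request. If you want Plug-style short-circuiting, one possible variant uses Enum.reduce_while/3. This is a sketch, not part of the original module, and the two example middlewares are illustrative:

```elixir
# A halt-aware variant: stop running middlewares once one sets :halted
compose_middleware = fn middlewares ->
  fn conn ->
    Enum.reduce_while(middlewares, conn, fn middleware, acc ->
      case middleware.(acc) do
        %{halted: true} = halted_conn -> {:halt, halted_conn}
        next_conn -> {:cont, next_conn}
      end
    end)
  end
end

reject_all = fn conn -> conn |> Map.put(:status, 401) |> Map.put(:halted, true) end
never_runs = fn conn -> Map.put(conn, :reached, true) end

pipeline = compose_middleware.([reject_all, never_runs])
pipeline.(%{status: 200, halted: false})
# status becomes 401, halted becomes true, and :reached is never set
```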

Best Practices

1. Keep Functions Focused

Each function should do one thing well:

# Good: Each function has a single responsibility
def trim_whitespace(string), do: String.trim(string)
def lowercase(string), do: String.downcase(string)
def replace_spaces(string), do: String.replace(string, " ", "_")

# Less good: one function bundles all three steps, so none can be reused individually
def process_string(string) do
  string
  |> String.trim()
  |> String.downcase()
  |> String.replace(" ", "_")
end

2. Make Functions Composable

Design functions to work well in composition:

# Good: Pure function without side effects
def add_prefix(string, prefix), do: prefix <> string

# Less composable: Side effects break composition
def add_prefix_and_print(string, prefix) do
  result = prefix <> string
  IO.puts(result)
  result
end

3. Use Consistent Return Types

Functions that might fail should use consistent patterns:

# Good: Consistent error handling
def safe_divide(_a, 0), do: {:error, "Division by zero"}
def safe_divide(a, b), do: {:ok, a / b}

def safe_sqrt(n) when n < 0, do: {:error, "Cannot take square root of negative"}
def safe_sqrt(n), do: {:ok, :math.sqrt(n)}

# Now they compose well
def calculate(a, b) do
  # 'with' chains operations that might fail
  # If any step returns an error, it short-circuits
  with {:ok, divided} <- safe_divide(a, b),
       {:ok, root} <- safe_sqrt(divided) do
    {:ok, root}
  end
end
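
To trace the `with` chain, here is a self-contained version using anonymous functions (equivalent to the module functions above) that you can paste straight into IEx:

```elixir
# Inline equivalents of safe_divide/2 and safe_sqrt/1
safe_divide = fn
  _a, 0 -> {:error, "Division by zero"}
  a, b -> {:ok, a / b}
end

safe_sqrt = fn
  n when n < 0 -> {:error, "Cannot take square root of negative"}
  n -> {:ok, :math.sqrt(n)}
end

calculate = fn a, b ->
  with {:ok, divided} <- safe_divide.(a, b),
       {:ok, root} <- safe_sqrt.(divided) do
    {:ok, root}
  end
end

calculate.(8, 2)   # {:ok, 2.0} - 8 / 2 = 4.0, sqrt(4.0) = 2.0
calculate.(1, 0)   # {:error, "Division by zero"}
calculate.(-8, 2)  # {:error, "Cannot take square root of negative"}
```

Whichever clause fails first produces the error that calculate returns; the later steps never run.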

4. Document Composition Intent

Make it clear how functions are meant to be composed:

defmodule Pipeline do
  # @moduledoc adds documentation to the module
  @moduledoc """
  Functions designed to be composed in a processing pipeline.
  Each function takes and returns a map with :data and :metadata keys.
  """

  def validate(%{data: _data} = input) do
    # validation logic
    input
  end

  def transform(%{data: _data} = input) do
    # transformation logic
    input
  end

  def enrich(%{data: _data} = input) do
    # enrichment logic
    input
  end
end

Conclusion

Function composition is a fundamental technique in functional programming that enables you to build complex functionality from simple, reusable parts. In Elixir, we have multiple tools and patterns for composition:

  • Basic composition using anonymous functions and the pipe operator
  • Higher-order functions that return new functions
  • Partial application and currying techniques
  • Advanced patterns for error handling with {:ok, value} tuples
  • Real-world applications in data processing and middleware

Key takeaways:

  • Compose small, focused functions rather than writing large, complex ones
  • Use consistent return types to make composition easier
  • Leverage Elixir's pipe operator for readable left-to-right composition
  • Consider higher-order functions when you need to parameterize behavior
  • Apply composition patterns to solve real problems like data pipelines and middleware

By mastering function composition, you'll write more modular, testable, and maintainable Elixir code.

Next Steps

In the upcoming article, we'll begin exploring Advanced Pattern Matching:

Pattern Matching Fundamentals

  • Deep dive into pattern matching mechanics
  • Pattern matching in different contexts (case, cond, with)
  • Advanced matching techniques
  • Common patterns and anti-patterns

Pattern matching is one of Elixir's most distinctive features, and mastering it will unlock new ways to write expressive, efficient code.

Top comments (2)

Stefano1990

Thank you for your article. I enjoyed reading it. However, I think all of them would work better with an argument and do operations on that argument instead of returning a function.

There are a bunch of advantages here:

  1. You can write specs for your handled inputs and expected outputs.
  2. The compiler might warn you about unexpected types (depending on elixir version)
  3. It's easier to pattern match on the input and do a different kind of validation or transform.

The code you show in the end looks much better and more like normal elixir code. The only thing that I would change here is the unnecessary extraction in the function head. Extraction isn't an anti pattern as such but it's something that can be abused and makes code harder to reason about quickly. See hexdocs.pm/elixir/code-anti-patter....

João Paulo Abreu

Thank you for your thoughtful feedback and for taking the time to read the article! I really appreciate your insights. You make excellent points about the advantages of working directly with arguments rather than returning functions, and I can see how that approach would lead to more idiomatic Elixir code with better tooling support through specs and compiler warnings.

However, I want to clarify that my articles are not intended to cover anti-patterns - they are introductory articles focused on presenting the fundamental concepts. Anti-patterns are more advanced topics that I haven't studied in depth yet. These articles are part of a beginner-friendly series where I'm learning and sharing knowledge step by step. My goal is to introduce core concepts like function composition in a way that's accessible to newcomers, even if some of the examples might not represent the most idiomatic or production-ready approaches.

Your feedback about the extraction patterns and the link to the anti-patterns documentation is valuable, and it's definitely something I'd like to explore in future, more advanced articles once I've built a stronger foundation in these areas. Thanks again for the constructive input - it helps me understand what topics would be valuable to cover as the series progresses!