On the road to functional programming enlightenment, I discovered two beautiful one-liners for depth-first and breadth-first search in Haskell. Here they are:

```
dfs, bfs :: (a -> [a]) -> a -> [a]
dfs succ start = start : concatMap (dfs succ) (succ start)
bfs succ start = concat $ takeWhile (not . null) $ iterate (concatMap succ) [start]
```

Both functions generate the list of states reachable from a starting state in a finite number of steps. The inputs to `bfs` and `dfs` are a “successors” function, taking a state to the list of its “child” states, and the starting state.

What consistently impresses me about Haskell is that algorithms can be expressed in code as concisely as in English, *if only you can learn the language to the same level of native fluency*. As a native English speaker who’s only been writing Haskell casually for a year, I puzzled over the two one-liners in this post for about an hour before getting them right!

# Explanation

For the simple one-liner versions, I had to assume that the search space was a tree, i.e. no two paths to the same state. This special case is useful in practice, for example in problems that build a sequence of words or letters one step at a time. And since discovering it, I’ve used `dfs` exactly as above in a fun and simple palindromes-generation program which I may post here soon.

But for illustration, consider the following crudely rendered test tree in which states are just numbers rather than sequences of words:

```
              1
      /       |       \
     2        3        4
   / | \    / | \    / | \
  5  6  7  8  9  10 11 12 13
          / \
        14   15
```

Here it’s intended that 2, 3, and 4 are the successors of 1, and so forth. To represent this tree in a way compatible with the `bfs` and `dfs` combinators above, we need to describe successors as an `Int -> [Int]` function, e.g.

```
testSucc :: Int -> [Int]
testSucc 1 = [2,3,4]
testSucc 2 = [5,6,7]
testSucc 3 = [8,9,10]
testSucc 4 = [11,12,13]
testSucc 8 = [14,15]
testSucc _ = []
```

With this test data in place, depth-first search is simple to describe: it immediately yields the start node, then recursively calls `dfs` (with the same successors function) on each successor of the start node, and concatenates those results, or

```
dfs :: (a -> [a]) -> a -> [a]
dfs succ start = start : concatMap (dfs succ) (succ start)
```

as we said above. After loading our code into the GHCi REPL, we have

```
*Main> dfs testSucc 1
[1,2,5,6,7,3,8,14,15,9,10,4,11,12,13]
```

as we should expect, and

```
*Main> dfs testSucc 2
[2,5,6,7]
*Main> dfs testSucc 3
[3,8,14,15,9,10]
*Main> dfs testSucc 4
[4,11,12,13]
```

which, concatenated, give `tail (dfs testSucc 1)`. Cool.

I’ll note that the restriction to trees can certainly be lifted, in the totally standard way: track a set of visited states so that no state is “expanded” more than once. A cost is paid in elegance, as our code balloons from *one* line to *four*!

```
import qualified Data.Set as Set
import Data.Set (Set)

dfs2 :: (Ord a) => (a -> Set a) -> a -> [a]
dfs2 succ start = loop [start] (Set.singleton start) where
  loop [] _ = []
  loop (x:xs) visited = x : loop (Set.toList new ++ xs) (Set.union visited new)
    where new = Set.difference (succ x) visited
```
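To see the visited set earn its keep, here is a quick check on a hypothetical diamond-shaped graph (`diamondSucc`, my own example, not from the post) in which states 2 and 3 share the successor 4. Plain `dfs` would list 4 twice; `dfs2` expands it once. (The definition of `dfs2` is repeated so the snippet stands alone.)

```
import qualified Data.Set as Set
import Data.Set (Set)

dfs2 :: (Ord a) => (a -> Set a) -> a -> [a]
dfs2 succ start = loop [start] (Set.singleton start) where
  loop [] _ = []
  loop (x:xs) visited = x : loop (Set.toList new ++ xs) (Set.union visited new)
    where new = Set.difference (succ x) visited

-- a diamond: two paths from 1 reach state 4
diamondSucc :: Int -> Set Int
diamondSucc 1 = Set.fromList [2,3]
diamondSucc 2 = Set.fromList [4]
diamondSucc 3 = Set.fromList [4]
diamondSucc _ = Set.empty

main :: IO ()
main = print (dfs2 diamondSucc 1)  -- [1,2,4,3]: state 4 appears only once
```

Note that 4 shows up before 3: newly discovered states are *prepended* to the frontier, which is exactly what makes this depth-first.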

Breadth-first search can be written nearly identically, by changing the “frontier” (the first parameter to `loop`) from a stack to a queue, though efficient queues are not entirely trivial to implement in functional programming (whereas Haskell’s immutable linked lists make fine stacks).
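For what it’s worth, here is a sketch of that graph-safe `bfs2` (my own, not from the post): it differs from `dfs2` in exactly one spot, appending new states to the frontier rather than prepending them. Appending to a plain list is O(n) per step, which is precisely the inefficiency a real queue would avoid.

```
import qualified Data.Set as Set
import Data.Set (Set)

bfs2 :: (Ord a) => (a -> Set a) -> a -> [a]
bfs2 succ start = loop [start] (Set.singleton start) where
  loop [] _ = []
  -- the only change from dfs2: xs ++ ... instead of ... ++ xs
  loop (x:xs) visited = x : loop (xs ++ Set.toList new) (Set.union visited new)
    where new = Set.difference (succ x) visited

-- Set-valued version of the test tree
testSuccSet :: Int -> Set Int
testSuccSet 1 = Set.fromList [2,3,4]
testSuccSet 2 = Set.fromList [5,6,7]
testSuccSet 3 = Set.fromList [8,9,10]
testSuccSet 4 = Set.fromList [11,12,13]
testSuccSet 8 = Set.fromList [14,15]
testSuccSet _ = Set.empty

main :: IO ()
main = print (bfs2 testSuccSet 1)  -- [1,2,...,15], level by level
```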

And so I was pretty happy to find a functional `bfs` one-liner without explicit mention of queues. It’s just slightly more complicated than `dfs`, the meat of the definition lying in the expression

```
iterate (concatMap succ) [start]
```

which generates a list of all the “levels” of the search space. Here I mean that the singleton list `[start]` is level 0, the successors of `start` constitute level 1, the successors of level-1 states constitute level 2, and so on. That’s what `concatMap succ` does: for a list of the nodes in some level `n`, it finds all successors of all of them, i.e. it generates the next level `n+1`.

Meanwhile `iterate`, which has type `(a -> a) -> a -> [a]`, is a very useful combinator and really a beautiful poster child for Haskell’s lazy semantics. Given a starting value, it generates an infinite list by repeatedly applying a function to the last generated value. Just don’t try to print the whole thing!

```
*Main> take 10 $ iterate (*2) 1
[1,2,4,8,16,32,64,128,256,512]
```

Or in the context of `bfs`, of course we get the levels of the tree:

```
*Main> take 10 $ iterate (concatMap testSucc) [1]
[[1],[2,3,4],[5,6,7,8,9,10,11,12,13],[14,15],[],[],[],[],[],[]]
```

with infinitely many empty levels at the end. This is easily cleaned up to a working implementation of breadth-first search as above:

```
bfs :: (a -> [a]) -> a -> [a]
bfs succ start = concat $ takeWhile (not . null) $ iterate (concatMap succ) [start]
```

In GHCi, we get

```
*Main> bfs testSucc 1
[1,2,3,4,5,6,7,8,9,10,11,12,13,14,15]
```

as expected.

> What consistently impresses me about Haskell is that algorithms can be expressed in code as concisely as in English, if only you can learn the language to the same level of native fluency. As a native English speaker who’s only been writing Haskell casually for a year, I puzzled over the two one-liners in this post for about an hour before getting them right!

I found this amusing because one way of reading it is that you’re impressed by how Haskell forces newcomers to struggle for hours to understand the code.

I think I know what you mean, though.

I’ve heard that Haskell teaches a way of thinking, such that when you program in other languages, it can be helpful to imagine what you’d write in Haskell. This is what attracted me to start learning the language a few months ago. And what you’re saying sounds similar.

It sounds like you’re saying that Haskell can express programs as concisely as English, but perfectly precisely. Even if you never do this in published code, for fear of confusing your readers, it’s still good for you, for your own thinking, to be able to do this.

I used to sometimes write my mathematical thoughts using the notation of formal symbolic logic. This was very tedious, though. I’m hoping Haskell will give me a way to achieve that kind of precision of thought when I want it, without the clunkiness of symbolic logic.

Thanks for your comment. I do endorse the idea that “a different way of thinking is good for you”, though this isn’t the main point I was trying to make in that paragraph. AFAIK, no other semi-mainstream language is lazy by default, and this basically forces Haskell to be side-effect free, which of course is wildly at odds with *actually* mainstream languages.

But I really just meant that I feel I could be amazingly productive in Haskell if I spent the time to learn it better and then just *knew* when to use helpful combinators like the fold variations, iterate, concatMap, and all the monadic versions, e.g. mapM and so forth. Readers familiar with Haskell idioms would probably find the resulting code *less* confusing, though I agree such terseness can challenge or even alienate newcomers.

For a kind of trivial example outside of the post: when I was first learning Haskell, I wanted to define the function

```
applyAll :: [a -> a] -> a -> a
```

which applies a list of functions to a single element, e.g. `applyAll [f,g,h] x == f (g (h x))`. This is easy enough with recursion (which I used at the time), but it’s a clean one-liner point-free:

```
applyAll = foldr (.) id
```

Moreover, the point-free formulation reminds us that this function should really be called `composeAll`, which sounds like a more fundamental operation.
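A quick sanity check of the equivalence (my own example functions, chosen arbitrarily):

```
applyAll :: [a -> a] -> a -> a
applyAll = foldr (.) id

main :: IO ()
main = print (applyAll [(+1), (*2), subtract 3] 10)
-- foldr (.) id [f,g,h] == f . g . h,
-- so this computes (+1) ((*2) (subtract 3 10)) = 15
```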