Here I'd like to show an example of runtime complexity, arising from a practical problem.

The problem is this: we want to randomly assign coworkers to get each other gifts; we do not want to assign anyone to themself.

We can do this in J, viz.

fixSelf =: ([: */ ~:) (i. @: #)
pickDerangement =: ($:`[ @. fixSelf) @: (?~@#)
secretSanta =: ] ,: (C.~ pickDerangement)

secretSanta is a monad which will shuffle everyone:

secretSanta ('Macbeth' ; 'Duncan' ; 'Malcolm' ; 'Banquo' ; 'Macduff')
┌───────┬───────┬───────┬──────┬───────┐
│Macbeth│Duncan │Malcolm│Banquo│Macduff│
├───────┼───────┼───────┼──────┼───────┤
│Malcolm│Macduff│Macbeth│Duncan│Banquo │
└───────┴───────┴───────┴──────┴───────┘

At a high level:

1. A random permutation is selected.

2. If it contains fixed points, we return to step 1. If not, we proceed.

3. We apply the derangement to our original list, and laminate for sake of presentation.
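
The same rejection-sampling loop can be sketched in a more mainstream language. Here it is in Python (`pick_derangement` and `secret_santa` are my own names; `random.shuffle` plays the role of J's `?~`):

```python
import random

def pick_derangement(n):
    """Draw uniform random permutations of range(n) until one has no
    fixed point (p[i] != i for every i).  Requires n >= 2: a single
    element has no derangement, so n = 1 would loop forever."""
    p = list(range(n))
    while True:
        random.shuffle(p)                      # uniform permutation, O(n)
        if all(p[i] != i for i in range(n)):   # fixed-point check, O(n)
            return p

def secret_santa(names):
    """Pair each giver with a recipient via a random derangement."""
    p = pick_derangement(len(names))
    return [(giver, names[i]) for giver, i in zip(names, p)]

pairs = secret_santa(['Macbeth', 'Duncan', 'Malcolm', 'Banquo', 'Macduff'])
# Nobody is assigned to themself:
assert all(giver != recipient for giver, recipient in pairs)
```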

This approach is concise and in fact efficient, but the efficiency is not immediately obvious: the worst-case running time is $$O(\infty)$$, so to speak, since the loop can in principle reject forever (an event which occurs with probability zero).

In fact, the expected performance is $$O(n)$$: selecting a random permutation is $$O(n)$$, and the probability that a given permutation is a derangement is bounded away from zero for all $$n \geq 2$$, viz.

$$!n = \lfloor \frac{n!}{e} + \frac{1}{2} \rfloor$$
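
That closed form can be sanity-checked against the standard derangement recurrence $$!n = (n-1)\,(!(n-1) + \,!(n-2))$$ (a Python sketch; `subfactorial` is my own name):

```python
import math

def subfactorial(n):
    """Count derangements via !n = (n-1)(!(n-1) + !(n-2)),
    with base cases !0 = 1 and !1 = 0."""
    a, b = 1, 0
    for k in range(2, n + 1):
        a, b = b, (k - 1) * (a + b)
    return b if n >= 1 else a

# The closed form agrees for small n (floats are exact at this scale):
for n in range(1, 15):
    assert subfactorial(n) == math.floor(math.factorial(n) / math.e + 0.5)
```

The ratio $$!n / n!$$ settles near $$\frac{1}{e}$$ almost immediately.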

whence each attempt is a Bernoulli trial with success probability $$\approx\frac{1}{e}$$, the number of attempts follows a geometric distribution, and the expected number of permutations we'll need to select is $$e$$.
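
A quick Monte Carlo check of that expectation (my own sketch, not part of the original program): count how many permutations we draw before hitting a derangement, and average over many trials. The sample mean should land near $$e$$.

```python
import math
import random

def draws_until_derangement(n, rng):
    """Count uniform permutations drawn before the first derangement."""
    p = list(range(n))
    draws = 0
    while True:
        draws += 1
        rng.shuffle(p)
        if all(p[i] != i for i in range(n)):
            return draws

rng = random.Random(42)             # seeded for reproducibility
trials = 20_000
mean = sum(draws_until_derangement(10, rng) for _ in range(trials)) / trials
print(round(mean, 3), 'vs e =', round(math.e, 3))
```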

So we can't prove termination, because the algorithm doesn't always terminate; what we can prove is that non-termination occurs with probability zero. I think this is quite interesting, because neither Idris nor ATS handles this case, despite the fact that such algorithms occur in practice.

Point being: it's possible to come up with example algorithms that stretch the current tools of computer science, but can be handled with pencil-and-paper mathematics.