This page demonstrates the l1tf trend detection library, inspired by Kim, Koh & Boyd, but solving the optimization problem using a gradient descent method that is better suited to efficient implementation in JavaScript.

The blue line below represents the log of the S&P 500 index for the period March 25, 1999 to March 9, 2007. The orange line represents the computed trend. The slider controls the smoothness of the trend. The trend is computed from scratch in realtime every time you move the slider.

How does this work? It's trying to find a trend line that minimizes an error function in two parts:

• First, the total squared error of the trend line vs the original data.
• Second, the total absolute difference in slope between adjacent points on the trend line.

If we wanted to minimize only the first part, we'd just make our trend identical to the original data, and it would have zero squared error. If we wanted to minimize only the second part, we'd just make our trend any straight line, and there would be zero difference in slope all along it. Combining them forces us to find straight lines that don't deviate too far from the original data - and wherever they do start deviating, we have to pay the cost of adding a kink and changing the slope.
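The two-part objective described above can be sketched directly. This is an illustrative evaluation of the cost, not the library's internal solver: the change in slope at each interior point is the second difference of the trend values.

```javascript
// Sketch of the two-part objective: squared error vs. the data,
// plus lambda times the total absolute change in slope.
function objective(y, trend, lambda) {
  let squaredError = 0;
  for (let i = 0; i < y.length; i++) {
    const d = y[i] - trend[i];
    squaredError += d * d;
  }
  // The change in slope at point i is the second difference:
  // (trend[i+1] - trend[i]) - (trend[i] - trend[i-1]).
  let slopeChange = 0;
  for (let i = 1; i < trend.length - 1; i++) {
    slopeChange += Math.abs(trend[i + 1] - 2 * trend[i] + trend[i - 1]);
  }
  return squaredError + lambda * slopeChange;
}
```

A trend identical to the data zeroes the first term; a perfectly straight trend zeroes the second. The solver's job is to find the trend minimizing their weighted sum.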

The smoothness parameter controlled by the slider trades off between the two types of error: the farther you move it to the right, the less weight goes to squared error and the more goes to consistent slope.

The code is available on GitHub. Example usage:

```js
var smoothness = 0.5
var change_point_xy_pairs = l1tf(y_values, smoothness).points
```
A note about smoothness: the weight on the slope term in the error is called lambda in the paper. There's a maximum value, lambda_max, beyond which the trend collapses to a single straight line.

Here, we set

`lambda = (smoothness^4)(lambda_max)`
This is somewhat arbitrarily chosen to give a nice subjective feel of steadily increasing smoothness as it goes from 0 to 1.
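As a minimal sketch of that mapping (the function name is hypothetical, and reading the exponent as a fourth power is an assumption based on the formula above):

```javascript
// Hypothetical helper: map the slider's 0-1 smoothness value to lambda.
// The fourth power is the arbitrary curve chosen for subjective feel.
function lambdaFromSmoothness(smoothness, lambdaMax) {
  return Math.pow(smoothness, 4) * lambdaMax;
}
```

At smoothness 0 this yields lambda = 0 (trend equals the data); at smoothness 1 it yields lambda_max (a single straight line), with a slow ramp in between.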

Authors: