The article "Will Tech-Driven Deflation Export Japan's Economic Woes to the World?" reminded me of this video, "Evolution of the Desk":

The origin for the above video is http://bestreviews.com/electronics#evolution-of-the-desk.

Each time an object gets digitized that's one less object to manufacture, one less object to ship, one less object consuming raw materials. It's actually more surprising that this hasn't had an even larger impact on the U.S. economy.

2017-04-19

We just finished migrating all of our monitoring from InfluxDB to Prometheus, and I thought I'd write up our reasons for the change. Please note that these are my own personal observations and relate to a specific project; these issues may not apply to you, and you should evaluate each product for your own uses.

**Update:** To clarify, the versions of InfluxDB and Prometheus that I
am talking about are InfluxDB 1.1.1 and Prometheus 1.5.2.

- **InfluxDB**: a push-based system, i.e. your running application needs to actively push data into the monitoring system.
- **Prometheus**: a pull-based system; the Prometheus server periodically fetches the metric values from the running application.

With Prometheus's centralized control of polling, I can switch from polling every minute to every 10 seconds just by adjusting the configuration of the Prometheus server. With InfluxDB I would have to redeploy every application to change how often it pushes metrics. In addition, the pull method allows Prometheus to create and offer a synthetic `up` metric that monitors whether an application is up and running. For short-lived applications Prometheus has a push gateway.
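As a sketch of what that centralized control looks like, here is a minimal `prometheus.yml`; the job name and target below are placeholders, not our actual config:

```yaml
# One line in the server config controls the poll frequency for every target.
global:
  scrape_interval: 10s   # was 1m; no application redeploys required

scrape_configs:
  - job_name: example-app              # placeholder job name
    static_configs:
      - targets: ['localhost:9100']    # placeholder target
```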

- **InfluxDB**: a monolithic database for both metric values and indices.
- **Prometheus**: LevelDB for indices, but each metric is stored in its own file.

Both use key/value datastores, but how they use them is very different, and it affects the performance of the products. InfluxDB was slower and took up substantially more disk space than Prometheus for the exact same set of metrics. Just starting up InfluxDB and sending a small number of metrics to it caused the datastore to grow to 1GB, and from there it grew rapidly to hundreds of GB for our full set of metrics, while Prometheus has yet to crack 10GB with all of our metrics. And let's not even go into the number of times InfluxDB lost all of our data, either from a crash or from a failed attempt to upgrade the version of InfluxDB we were running.

**Update:** I was also reminded of another datastore-related issue:
startup time. Prometheus starts in a matter of seconds, while InfluxDB
would regularly take 5 minutes to restart while it either validated or
rebuilt its indices, and it would not collect data during the entire
process.

Probably closely related to the datastore efficiency: InfluxDB was coming close to maxing out the server it was running on, while Prometheus, running on an identical instance, peaks at maybe a 0.2 load.

- **InfluxDB**: uses a variant of SQL.
- **Prometheus**: uses a substantially simpler and more direct querying model.

What would you rather type?

```sql
SELECT * FROM "cpu_load_short" WHERE "value" > 0.9
```

or

```
cpu_load_short > 0.9
```

- **InfluxDB**: configuration is done through a mix of config files and SQL commands sent to the server.
- **Prometheus**: text files.

Prometheus config is simply YAML files, and the entire config is done via files. With InfluxDB you have to worry about whether some of the config, for example creating the named database that metrics are to be stored in, actually got done. Additionally, Prometheus just picks more reasonable defaults: it defaults to only storing data for 15 days, while InfluxDB defaults to storing all data forever, and if you don't want to store all data forever you need to construct an SQL command to send to the server to control how data is retained.
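For example, in InfluxDB 1.x, limiting retention means sending the server something like the following InfluxQL; the database name is a placeholder, and the duration here just mirrors the Prometheus default:

```sql
-- Keep only 15 days of data; "metrics" is a placeholder database name.
CREATE RETENTION POLICY "fifteen_days" ON "metrics" DURATION 15d REPLICATION 1 DEFAULT
```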

2017-03-18

Geometric Algebra can be applied to physics, and many of the introductions to GA online cover this, but they immediately jump to electromagnetic fields or quantum mechanics, which is unfortunate, since GA can also greatly simplify 2D kinematics. One such example is uniform circular motion.

You should be familiar with all the concepts presented in An Introduction to Geometric Algebra over ℝ^{2} before proceeding.

If we have a vector **p** that moves at a constant rate of ω
rad/s and has a starting position **p**_{0}, then we can
describe the vector as a function of time:

**p** = **p**_{0}e^{**I**ωt}

Let's figure out what the derivative of a Rotor looks like, by first recalling its definition:

e^{**I**θ} = cos θ + **I** sin θ

We take the derivative with respect to θ:

d/dθ e^{**I**θ} = -sin θ + **I** cos θ

At this point observe that *cos* and *sin* have just changed
places, along with a sign change, but we know of another operation that does
the same thing, which is multiplication by **I**, so we get:

d/dθ e^{**I**θ} = **I**(cos θ + **I** sin θ) = **I**e^{**I**θ}

Not only does the derivative have a nice neat expression, we can read off from the formula what is happening: the derivative is a vector that is rotated 90 degrees from the original vector. Also note that normally the geometric product isn't commutative, but in this case both factors are Rotors, so the order doesn't matter.

We can go through the same process to show what happens if θ has
a constant multiplier *k*:

d/dθ e^{**I**kθ} = k**I**e^{**I**kθ}

With our new derivative in hand we can now find the velocity vector
for our position vector **p**, since velocity is just the derivative
of position with respect to time:

**v** = d**p**/dt = **p**_{0}ω**I**e^{**I**ωt} = ω**pI**

Again, because we are using Geometric Algebra, we can read off what is going on geometrically from the formula, that is, the derivative is a vector orthogonal to the position vector that is scaled by ω.

Note that we've drawn the vector as starting from the position, but that's not required.

We get the acceleration vector in the same manner, by taking the derivative of the velocity vector with respect to time:

**a** = d**v**/dt = **p**_{0}(ω**I**)^{2}e^{**I**ωt} = -ω^{2}**p**

And again we can just read off from the formula what is going on
geometrically, which is that we end up with a vector that is rotated
180 degrees from the position vector, and scaled by ω^{2}.

We can place the acceleration and velocity vectors as starting from the position vector, and that looks like:

Note how simple this was to derive, and that the geometric interpretation
could be read off of the resulting formulas. We never needed to leave the
2D plane, that is, all of these calculations took place in 𝔾^{2}.
The more classical derivations for uniform circular motion rely on the
cross-product, which takes you out of ℝ^{2} into ℝ^{3} and
which doesn't generalize to higher dimensions.
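Because the even elements {1, **I**} of 𝔾^{2} multiply exactly like the complex numbers, these formulas are easy to sanity-check numerically. Here is a minimal sketch using Python's built-in complex type, with **I** playing the role of *i*; the values of ω, **p**_{0}, and t are arbitrary choices of mine:

```python
import cmath

omega = 2.0   # angular velocity in rad/s (arbitrary)
p0 = 3 + 1j   # starting position p_0, encoded as a complex number (arbitrary)
t = 0.7       # sample time in seconds (arbitrary)

# Position: p = p0 e^(I w t)
p = p0 * cmath.exp(1j * omega * t)
# Velocity: v = dp/dt, which should be p rotated 90 degrees and scaled by w
v = p0 * (1j * omega) * cmath.exp(1j * omega * t)
# Acceleration: a = dv/dt, which should be p rotated 180 degrees and scaled by w^2
a = p0 * (1j * omega) ** 2 * cmath.exp(1j * omega * t)

assert abs(v - omega * p * 1j) < 1e-12   # orthogonal to p, scaled by w
assert abs(a + omega ** 2 * p) < 1e-12   # opposite to p, scaled by w^2
```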

2017-01-01

Geometric Algebra is fascinating, and I believe it solves a large number of problems that arise from a more traditional approach to vectors, but I've been very disappointed with the quality of the books and explanations I've found; most of them zoom off into abstract realms too quickly, or spend an inordinate amount of time building up a generalized theory before finally getting to something useful.

Below is an explanation of Geometric Algebra that will start with a simple
two dimensional vector space, i.e. ℝ^{2}. This will be a concise
introduction to 𝔾^{2}, the Geometric Algebra over ℝ^{2},
and then quickly pivot to applications in 𝔾^{2}. This introduction
will not cover the fascinating history of GA, Clifford Algebras, or
Hermann Grassman.

I'll presume a familiarity with Linear Algebra;
we'll introduce the geometric product on top of
that, and then we'll have the Geometric Algebra
over two dimensions: 𝔾^{2}.

> Linear algebra is the branch of mathematics concerning vector spaces and linear mappings between such spaces. It includes the study of lines, planes, and subspaces, but is also concerned with properties common to all vector spaces. -Wikipedia

You should be familiar with the following axioms and definitions from Linear Algebra:

| Property | | |
| --- | --- | --- |
| Associative | (1) | **a** + (**b** + **c**) = (**a** + **b**) + **c** |
| Commutative | (2) | **a** + **b** = **b** + **a** |
| Identity | (3) | **a** + **0** = **a** |
| Inverse | (4) | **a** + (-**a**) = **0** |
| Scalar Distributive | (5) | λ(**a** + **b**) = λ**a** + λ**b** |
| Multiplicative Identity | (6) | 1**a** = **a** |
| Dot/Inner Product | (7) | **a** · **b** = \|**a**\|\|**b**\| cos θ |
| Dot/Inner Product (Alternate) | (8) | **a** · **b** = a_{1}b_{1} + a_{2}b_{2} |

In particular, for ℝ^{2} we have an orthonormal basis:

{**e**_{1}, **e**_{2}}

where:

**e**_{1} · **e**_{1} = **e**_{2} · **e**_{2} = 1 and **e**_{1} · **e**_{2} = 0

We know how to do vector addition and scalar multiplication of vectors, and that any vector can be represented as a linear combination of basis elements.

Things to remember about the dot product, or inner product: it is 0 for orthogonal vectors, since cos 90° = 0:

**a** · **b** = 0 if **a** ⊥ **b**

And a vector dotted with itself gives the square of the norm of the vector, since cos 0 = 1:

**a** · **a** = |**a**||**a**| cos 0 = |**a**|^{2}

One important thing to notice about Linear Algebra is how often you have
to step outside of ℝ^{2} to get work done. That is, operations
frequently have to take place outside ℝ^{2} or those operations
give you results outside of ℝ^{2}. For example, the dot product of
two vectors returns a scalar, which is not a member of ℝ^{2}.

Similarly, to rotate vectors you have to create matrices, which don't
exist in ℝ^{2}, and apply them to vectors through matrix
multiplication.

One final example is the cross-product, which takes two vectors and
operates on them to produce a vector that is orthogonal to the original
two vectors, but if you are in ℝ^{2} it doesn't exist, you have to
then view that cross-product vector as existing in ℝ^{3}, which
the original ℝ^{2} is embedded in.

All of this stands in stark contrast to 𝔾^{2}, where these
operations take place in 𝔾^{2} itself. In fact, many of the constructs we
use in Linear Algebra, such as rotations, exist as elements of
𝔾^{2}, and applying those operations is just a matter of taking the
geometric product with those elements. Not only is 𝔾^{2} closed under
many of these operations, but the operations themselves exist as elements in
𝔾^{2}.

The Geometric Algebra of 𝔾^{2} builds upon ℝ^{2},
extending it by adding multiplication, i.e. a geometric product.
Before we get to the geometric product we need to first quickly learn
about the exterior product.

The exterior product operates on two vectors and is written as:

**a** ∧ **b**

The exterior product represents the oriented area defined by the two vectors; more precisely, it represents an oriented area in the plane defined by those vectors, also known as a bivector. There are two important aspects of this. The first is that the exact shape doesn't matter. For example, the bivectors represented below are equal because they have the same orientation (counter-clockwise) and the same area (3).

The second important factor is that the exterior product is anticommutative, that is, if you reverse the order of the vectors involved then the sign of the exterior product changes.

Using two of the vectors above, note that the order that they are used in the exterior product will make the bivectors either clockwise or counter-clockwise.

The properties of the exterior product are:

| Property | | |
| --- | --- | --- |
| Associative | (1) | **a** ∧ (**b** ∧ **c**) = (**a** ∧ **b**) ∧ **c** |
| Scalar Associativity | (2) | λ(**a** ∧ **b**) = (λ**a**) ∧ **b** = **a** ∧ (λ**b**) |
| Left Distributive | (3) | **a** ∧ (**b** + **c**) = **a** ∧ **b** + **a** ∧ **c** |
| Right Distributive | (4) | (**b** + **c**) ∧ **a** = **b** ∧ **a** + **c** ∧ **a** |
| Anti-symmetric | (5) | **a** ∧ **b** = -**b** ∧ **a** |
| Zero for Parallel Vectors | (6) | **a** ∧ **b** = 0 if **a** ∥ **b** |

In what is going to become a recurring theme, let's look at what this means in terms of basis vectors. Since any vector can be written as a linear combination of basis vectors we get:

**a** = a_{1}**e**_{1} + a_{2}**e**_{2} and **b** = b_{1}**e**_{1} + b_{2}**e**_{2}

If we take their exterior product we get:

**a** ∧ **b** = (a_{1}**e**_{1} + a_{2}**e**_{2}) ∧ (b_{1}**e**_{1} + b_{2}**e**_{2}) = (a_{1}b_{2} - a_{2}b_{1}) **e**_{1} ∧ **e**_{2}

So the exterior product of any two vectors can be expressed as just a scalar
multiple of **e**_{1} ∧ **e**_{2}.

Now that we know about the exterior product, we can define the geometric product, which is just the sum of the inner product and the exterior product:

**ab** = **a** · **b** + **a** ∧ **b**

Using just the above definition you can show that the geometric product has the following properties:

| Property | | |
| --- | --- | --- |
| Associative | (1) | **a**(**bc**) = (**ab**)**c** |
| Scalar Associativity | (2) | λ(**ab**) = (λ**a**)**b** = **a**(λ**b**) |
| Left Distributive | (3) | **a**(**b** + **c**) = **ab** + **ac** |
| Right Distributive | (4) | (**b** + **c**)**a** = **ba** + **ca** |
| Norm | (5) | **aa** = \|**a**\|^{2} |
| Non-Commutative, except in some cases | (6) | **ab** ≠ **ba** in general |
| Vector Inverses | (7) | **a**^{-1} = **a**/\|**a**\|^{2} |
| Orthogonal vector multiplication | (8) | **ab** = **a** ∧ **b** if **a** · **b** = 0 |

With the geometric product as defined above, and vector addition, our
Geometric Algebra 𝔾^{2} forms a unital
associative algebra with an orthonormal basis:

{1, **e**_{1}, **e**_{2}, **e**_{1} ∧ **e**_{2}}

We can work out a multiplication table for the basis elements. Observe that if two elements are orthogonal then their dot product is zero, which implies that between orthogonal vectors the geometric product reduces to the exterior product, which is anti-symmetric. So for our basis vectors:

**e**_{1}**e**_{2} = **e**_{1} ∧ **e**_{2}

And that implies, by the anti-symmetry of the exterior product:

**e**_{2}**e**_{1} = -**e**_{1}**e**_{2}

And the geometric product of any basis element with itself, because they are parallel, means the exterior product is zero, so:

**e**_{1}**e**_{1} = **e**_{1} · **e**_{1} = 1, and likewise **e**_{2}**e**_{2} = 1

Note that we'll end up writing a lot of equations with basis vectors
multiplied together, so it's useful to have a shorthand, i.e.
**e**_{12} will be used as a short-hand for **e**_{1}**e**_{2}.

We can now complete a multiplication table for the geometric product of all the basis elements:

| | 1 | **e**_{1} | **e**_{2} | **e**_{12} |
| --- | --- | --- | --- | --- |
| **1** | 1 | **e**_{1} | **e**_{2} | **e**_{12} |
| **e**_{1} | **e**_{1} | 1 | **e**_{12} | **e**_{2} |
| **e**_{2} | **e**_{2} | -**e**_{12} | 1 | -**e**_{1} |
| **e**_{12} | **e**_{12} | -**e**_{2} | **e**_{1} | -1 |
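The basis multiplication rules are all we need to implement the full geometric product in code. Here is a minimal sketch, with a multivector encoded as a tuple of coefficients over {1, **e**_{1}, **e**_{2}, **e**_{12}}; the encoding and the function name are my own choices:

```python
import math

def gp(a, b):
    """Geometric product in G^2; a and b are (scalar, e1, e2, e12) tuples."""
    s1, x1, y1, b1 = a
    s2, x2, y2, b2 = b
    return (
        s1 * s2 + x1 * x2 + y1 * y2 - b1 * b2,  # scalar: e1e1 = e2e2 = 1, e12e12 = -1
        s1 * x2 + x1 * s2 + b1 * y2 - y1 * b2,  # e1 coefficient
        s1 * y2 + y1 * s2 + x1 * b2 - b1 * x2,  # e2 coefficient
        s1 * b2 + b1 * s2 + x1 * y2 - y1 * x2,  # e12: e1e2 = e12 = -e2e1
    )

one, e1, e2, e12 = (1, 0, 0, 0), (0, 1, 0, 0), (0, 0, 1, 0), (0, 0, 0, 1)
assert gp(e1, e1) == one              # a basis vector squares to 1
assert gp(e1, e2) == e12              # orthogonal vectors: product is the bivector
assert gp(e2, e1) == (0, 0, 0, -1)    # reversed order flips the sign
assert gp(e12, e12) == (-1, 0, 0, 0)  # the bivector squares to -1

# A Rotor in action: e1 times e^(I * 90 degrees) should give e2.
theta = math.pi / 2
rotor = (math.cos(theta), 0, 0, math.sin(theta))
rotated = gp(e1, rotor)
assert abs(rotated[1]) < 1e-12 and abs(rotated[2] - 1) < 1e-12
```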

Now that we know what elements of 𝔾^{2} look like and how to
manipulate them, it's now time to put them to work.

Let's start by multiplying two vectors:

**a** = a_{1}**e**_{1} + a_{2}**e**_{2} and **b** = b_{1}**e**_{1} + b_{2}**e**_{2}

Under the geometric product we get:

**ab** = (a_{1}b_{1} + a_{2}b_{2}) + (a_{1}b_{2} - a_{2}b_{1})**e**_{12}

We can see that from the product of two vectors we get a scalar plus a bivector.

What if we take a scalar plus a bivector and multiply it by a vector? Note that below we are using a capital letter for our scalar plus bivector:

**B** = s + t**e**_{12}

**B**(v_{1}**e**_{1} + v_{2}**e**_{2}) = (sv_{1} + tv_{2})**e**_{1} + (sv_{2} - tv_{1})**e**_{2}

That product gives us back a vector, so **B** is an element of 𝔾^{2}
that operates on vectors through the geometric product to give us another
vector.

A special case of **B** is called a Rotor, an element of 𝔾^{2} that
is just a restatement of Euler's formula in 𝔾^{2}. First, for reasons
that will become clearer later, we will begin to abbreviate
**e**_{12} as **I**:

e^{**I**θ} = cos θ + **I** sin θ

If you multiply any vector by this Rotor on the right it will rotate
that vector θ degrees in the direction from **e**_{1} to **e**_{2}.

For example, here is a dynamic illustration of the Rotor in action.
In this case, we are multiplying **e**_{1} by e^{**I**θ}.

Caveat: Rotors only work like this in ℝ^{2}, in ℝ^{3} and
above the formulation changes, so be aware of that.

Using geometric algebra makes it easy to read off this formula
and determine what is going to happen, i.e. the **e**_{1}
vector is going to be operated on via the geometric product and the result
will be another vector, rotated by θ.

Since our Rotor is a member of 𝔾^{2} it can be combined with
other operations. For example, we could start with a vector *p* at
an initial position and then perturb it by adding another vector
that is multiplied by our Rotor. In this case we set ω = 2.

We can take that one step further and rotate the whole thing around
the origin, where we set ω_{1} = 2.9 and ω_{2} =
1.

That might be easier to follow if instead of drawing the vector we draw the trail of points where the vector has been.

Some of the power of Geometric Algebra comes from being able to go back and forth between looking at a problem geometrically and looking at it algebraically. For example, it is easy to reason that rotating a vector θ degrees twice is the same as rotating that same vector 2θ degrees. We can write that out as an algebraic expression:

e^{**I**θ}e^{**I**θ} = e^{**I**2θ}

If we expand both sides of the equation above using the definition of
*e* we get:

cos 2θ + **I** sin 2θ = (cos θ + **I** sin θ)(cos θ + **I** sin θ) = (cos^{2} θ - sin^{2} θ) + **I**(2 sin θ cos θ)

Comparing the coefficients on the left hand side of the equation to those on the right hand side, we find we have derived the Double Angle Formulas:

cos 2θ = cos^{2} θ - sin^{2} θ and sin 2θ = 2 sin θ cos θ

You could start with the same geometric reasoning about any two angles, α and β, and use the same derivation to get the general Angle sum identities. The power here is the ability to move back and forth between algebraic and geometric reasoning quickly and easily.

From our definition of our Rotor, if we set θ to 90 degrees then,
since *cos* 90° = 0, we are left with only **I**, which is a
90 degree Rotor. But if we apply a 90 degree Rotor twice we should get
a 180 degree Rotor:

**II** = **I**^{2} = **e**_{1}**e**_{2}**e**_{1}**e**_{2} = -**e**_{1}**e**_{1}**e**_{2}**e**_{2} = -1

And -1 is exactly what we would expect, since that's what you multiply
a vector by to rotate it 180 degrees. But what we also have is a quantity
in 𝔾^{2} that when squared is equal to -1. This should remind you
of *i* in the complex numbers ℂ, but without the need to take the
square root of a negative number, or invoke anything imaginary. In
fact the subset of all linear combinations of **{1, I}** is closed
under the geometric product and is isomorphic to ℂ.

Now that we have learned about Rotors, let's apply that knowledge to characterize elements of the form **ab**, the geometric product of two non-zero vectors.

First, let's look at the relationship between any two non-zero vectors.

We can reason out geometrically that given **b** we can get **a**
from it by first scaling **b** to have a norm of 1, then rotating it to
have the same direction as **a**, and then finally scaling that unit
vector to have the same length as **a**. Now write that out
algebraically, where θ is the angle between the two vectors:

**a** = (|**a**|/|**b**|)**b**e^{**I**θ}

If we look at any product of two non-zero vectors, **ab**, we know we
get an operator that, under the geometric product, takes vectors and
returns new vectors. If we substitute our derivation of how to get **a**
from **b**, and use the fact that moving a vector through a Rotor flips
the sign of the angle (e^{**I**θ}**b** = **b**e^{-**I**θ}), then we get:

**ab** = (|**a**|/|**b**|)(**b**e^{**I**θ})**b** = (|**a**|/|**b**|)**bb**e^{-**I**θ} = |**a**||**b**|e^{-**I**θ}

So every such operator **ab** is actually just a rotation and a
dilation. We can see this in action if we have the operator **ab** and
apply it to vector **c** to get vector **d**. The animation will
perturb vector **b** to show how that affects vector **d**.

Our generalized form for the geometric product of two vectors is:

**B** = **ab** = s + t**I**

We can use what we've learned so far to break that apart into its scalar and Rotor components:

**B** = ke^{**I**θ}

Start by applying **B** to a unit basis element, which we know
has a norm of 1, which gives us a new vector **v**:

**v** = **e**_{1}**B** = s**e**_{1} + t**e**_{2}, so |**v**| = √(s^{2} + t^{2}) = k

We can see from the last equation that **v** has a norm of k,
and now that we know k, we can divide **B** by k to get our Rotor:

e^{**I**θ} = **B**/k = (s + t**I**)/√(s^{2} + t^{2})

While applying the operator **ab** above did show some of the behavior,
it may be useful to start over, this time building our operator from a
ratio. That is, if we have two vectors **a** and **b**, and are given a
third vector **c**, we'd like to calculate the vector **d** so that
the two pairs have the same ratio, i.e.

**d**/**c** = **b**/**a**

The geometric product isn't commutative, so we have to choose a side to do the division on; we will write this as:

**dc**^{-1} = **ba**^{-1}

But that's just a simple algebraic equation we can solve
by multiplying both sides on the right by **c**:

**d** = **ba**^{-1}**c**

The operator **ba**^{-1} should preserve the angle between the vectors it relates, i.e. the angle between **c** and **d** will equal the angle between **a** and **b**.

Let's see what the difference between **ab** and **ba** is.
First let's multiply out **ab** in terms of basis vectors:

**ab** = (a_{1}b_{1} + a_{2}b_{2}) + (a_{1}b_{2} - a_{2}b_{1})**I**

If we swap **a** and **b** we get:

**ba** = (a_{1}b_{1} + a_{2}b_{2}) + (a_{2}b_{1} - a_{1}b_{2})**I** = (a_{1}b_{1} + a_{2}b_{2}) - (a_{1}b_{2} - a_{2}b_{1})**I**

In that last step we just factored out a -1 from the coefficient of
**I**. If we substitute:

s = a_{1}b_{1} + a_{2}b_{2} and t = a_{1}b_{2} - a_{2}b_{1}

Then we get:

**ab** = s + t**I** and **ba** = s - t**I**

So if we reverse the order of the geometric product of our vectors we end up with the equivalent of the complex conjugate.

We will denote the reverse of the product of two vectors with a dagger:
(**ab**)^{†} = **ba**. While this maps to the conjugate in 𝔾^{2},
reversing a product of multiple vectors will be more important and
powerful in 𝔾^{3}.

If we multiply them together we find:

**BB**^{†} = (s + t**I**)(s - t**I**) = s^{2} + t^{2}

Their product just ends up being a scalar, so if we divide by that scalar value we should get:

**B**(**B**^{†}/(s^{2} + t^{2})) = 1

Which means we've found the multiplicative inverse of **B**:

**B**^{-1} = **B**^{†}/(s^{2} + t^{2})

Normally geometric products aren't commutative, but in this case we can see that we get the same result when we reverse the order of **B** and **B**^{†}:

**BB**^{†} = **B**^{†}**B** = s^{2} + t^{2}

So our inverse will work whether applied on the left or on the right.
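Since linear combinations of {1, **I**} multiply like complex numbers and the dagger acts like complex conjugation, the inverse formula is easy to check numerically. A quick sketch; the particular values of s and t are arbitrary choices of mine:

```python
# B = s + tI, modeled as a complex number with s = 3, t = 4 (arbitrary values)
B = 3 + 4j
B_dagger = B.conjugate()                 # reversing the product gives s - tI
B_inv = B_dagger / (B * B_dagger).real   # B^-1 = B-dagger / (s^2 + t^2)

assert abs(B * B_inv - 1) < 1e-12        # works applied on the right
assert abs(B_inv * B - 1) < 1e-12        # and on the left
```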

Let's see how that inverse operates by applying it to
our previous ratio example. This time we'll not only apply
the **ba**^{-1} operator to **c** to get **d**, but also apply its inverse
to **d**, which takes us back to **c**.

Note that starting from conjugates isn't the only way to construct such an inverse. For example, because each non-zero vector has a multiplicative inverse, we can come to the same conclusion:

(**ab**)^{-1} = **b**^{-1}**a**^{-1}, since (**ab**)(**b**^{-1}**a**^{-1}) = **a**(**bb**^{-1})**a**^{-1} = **aa**^{-1} = 1

There are other introductions to GA around the web, some of the ones I've found helpful are:

- Geometric Algebra Primer
- Geometric Algebra: An Introduction with Applications in Euclidean and Conformal Geometry

2016-12-21