Friday, November 10, 2006

Type inference library

Having a bit of time tonight, I decided to pick up my type inference library again:
  • It is written in C++.
  • Class abstraction is used to define the properties of types, as well as the data dependencies, as a graph of nodes and edges.
  • The graph is a hypergraph: an edge can connect more than two nodes.
  • Normal usage ties nodes to variables, edges to assignments and function calls, and graphs to function definitions together with an instantiation "cache" table.

The astute reader will note that this supports Agesen's cartesian product algorithm, which I am using.

Monday, November 06, 2006

volatility of index versus underlying stocks

I recently found this very interesting article:

It is interesting because it ties theory down with practice.
The math also looks interesting: a bit more complex than what one usually finds in this type of paper.

Saturday, September 23, 2006

Invariants in software architecture

An invariant is a property that does not change.
A software architecture is the plan of a program.
The invariants are the properties of the program that do not change during its execution.
Programs are all about changing things, so invariants are an important concept: they define what does not change.

Friday, September 01, 2006

Books at my bedside

Everybody likes to relax with a book before going to bed. These are the books I have at my bedside:

The randomized algorithms book is an all-time favorite, and so is the approximation algorithms one. I was not surprised to see that Amazon offers them together.

The combinatorial search book is one of those books that I take out from time to time because it just feels like the right knowledge to keep fresh in one's mind. I will admit I have never used anything that I learned from it.

The software estimation book is good but is really not my type of bedside book because it is too close to work. I have had it there for many months now but have not touched it; it does not allow me to relax.

The term rewriting implementation book is good fun, although it is a bit formal and does not offer the same bang for the time put in as the first two!

Monday, August 28, 2006

History of Human Computers

Google has a video on the history of human computers,
presented by David Alan Grier, who wrote a book on the subject (When Computers Were Human).

Halfway through, I started to feel that the talk was centering too much on the human part of the story, but then I realized that the non-human part had been presented many, many times before. So, yes, it is interesting to know who these people were who spent their days writing down calculations.

I especially liked the part where he mentions function tables because it brought me back to my youth, when I would browse through the mathematical handbook that my dad had long stopped using and read the numbers, and the formulas too of course. The numbers were interesting because they are a bit random, which in itself is a fascinating concept (if you are young enough!).
The tables of powers of two were also good fun!

Anyway, I found out that human computers were not well respected, as they were effectively clerks working under the command of the "planners".

Friday, August 25, 2006

Strategy versus execution schizophrenia

By nature I am a strategy guy. I spend my time thinking about the future, mostly because I really do believe that "only the paranoid survive", yet I have worked long enough to understand that execution is king. I still find this reality very difficult because "going analytical" is a bit like swimming underwater: when you get back to the surface you realize that you have been away for some time and things may have changed without you noticing. This is where the magic of time management does the job: have regular meetings that allow you to monitor and interact with the execution, and leave the rest of the time for strategic free thinking.

Thursday, August 24, 2006

FPGA soon mainstream?

With the advent of multi-core processors it is clear that at some point we are better off replacing one of those processor cores with an FPGA type of cell. I understand this is already the case for specialized multimedia processors, but when will AMD and Intel add it by default?

Wednesday, August 23, 2006

Wide usage of Finite Volume Method for Hyperbolic Problems

I have not played around with finite volume numerical methods in more than 10 years, and I was surprised to see how widely they are now being used to solve fluid dynamics type problems. I first noticed the usage on , where I found . But then a few searches on Google turned up so many results that I can't even find out when the method became widely used.

Some math for finance books (quant stuff)

Today I chose some books out of my library for a senior mathematician who wants to learn about financial mathematics:

Wilmott, Holton, and Kloeden-Platen are must-have reference books. Voit is fun, and the others are good education at different levels.

Tuesday, August 22, 2006

Corrupted languages versus functional languages

I like to program, but as I get older I am less and less happy about the fragile nature of code. I firmly believe that if you have the right people you should invest in more formal approaches to software development. The tricky part is that not enough people understand that any gap in invariant properties at one layer of software will corrupt the layers above. Languages like C++, Java, and C# oblige the developer to choose a subset of the language to build a robust core and then to extend it with limited features of the language. This is fundamentally a difficult thing to do, and it is why I spend much time thinking in terms of functional languages like Scheme, Haskell, and OCaml: they are not corrupted languages.

Note that a good way to learn functional languages is to start with Q: , a simple term rewriting language. It is easy to set up and use.
Personally, I prefer a term rewriting notation to a lambda-based notation for writing programs.

This said, look at ; these guys have understood what I am talking about.

A little noise to nudge people into taking risk

I listened to Aaron Brown present his idea that for an exchange to be efficient, the deals need to be incentivized with a chance of large payouts. This makes sense to me. What is more motivating: getting the average every time, or getting a random return with the chance of a high value? I think human nature cannot accept the hard reality of what an average return implies and will naturally search for risk in order to get a chance at a better return.

Algorithmic trading

It is interesting that some trading models are "built up" from domain-specific ideas, a bit like in physics, while other models are created out of data with little domain-specific knowledge. Google now translates text with algorithms that have learned only from existing text and have a very abstract notion of similarity in sequences of words. Likewise, people use neural networks in financial models. With these models it is often not easy, or even possible, to explain "how" a result was found. When decisions need to be made in milliseconds it is not possible to double-check with humans before an action is taken. The game is then to make more money than you lose while not even really understanding the why of the actions taken.