
Sunday, October 06, 2024

A software mind's eye

I have been doing higher-order functional programming in Python over the last few weekends:

  • hofppy (https://github.com/equational/hofppy) will be a Python library. For the moment it is a collection of Jupyter notebooks.
  • My initial goal was to have a handy FP toolkit that supports applied math with JAX's JIT. Yet I realise that what I am really doing is reimplementing the generic part of a compiler for a reactive (trading) language I wrote in 2010 in F#, while adding a few tricks I have picked up since then, the primary one being to think of code as implementing a synchronous execution model.
  • There is really very little code like this on the web, which is why I am doing this as open source.

This blog post is written in part to mention the above, as the first JAX-JIT-supporting monad and comonad models are already "nice". Yet it is also written to bring up the process of creating new technology.
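
To give a feel for what "JAX JIT supporting" means here, below is a minimal sketch of a state monad whose composed computations can be passed through jax.jit. It is my own toy illustration, not the hofppy API: the names unit, bind, get, put and accumulate are assumptions made for the example.

import jax
import jax.numpy as jnp

def unit(value):
    # Wrap a value into a stateful computation that leaves the state untouched.
    return lambda state: (value, state)

def bind(m, f):
    # Sequence m, then feed its value to f, threading the state through.
    def run(state):
        value, state2 = m(state)
        return f(value)(state2)
    return run

def get():
    return lambda state: (state, state)

def put(new_state):
    return lambda state: (None, new_state)

def accumulate(x):
    # Add x to the state and return the running total as the value.
    return bind(get(), lambda s: bind(put(s + x), lambda _: unit(s + x)))

program = bind(accumulate(1.0), lambda _: accumulate(2.5))

# The composed program is a pure function of the state, so it can be jitted whole.
total, final_state = jax.jit(program)(jnp.array(0.0))
print(total, final_state)  # 3.5 3.5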

My recipe for doing something new, such as leading a team on a new subject, or writing software that did not exist before, is the following:

  1. Create a "mind's eye" imaginary world of "usage" of the end result.
  2. Imagine how it should work.
  3. Start very quickly to implement features.
  4. Review 1, 2, and 3, as you regularly cycle through them, again, again, and again!

I use the word imagine twice above, for a few reasons...

People are constantly asking me "how do you know?", with regard to clients' requirements, technology, design, approach, etc. The reality is that I do not completely know: I know much, I have done much, I have written hundreds and hundreds of thousands of lines of software. Yet when something is new, no one fully knows how things will look in the end. The reality is the newer it is, the less you know. And... the more you think you know, the more you are fooling yourself. However, just as with poker, the optimal approach is to imagine much and often, and to work as hard as possible to build and maintain a mindful view of "new usage" and "new functioning technology".

The more experience, the better! Your imagination is only as good as the real details it contains. I have tons of real-life learning that I use for this higher-order FP in Python. I mention a few of these lessons here:
  • Lazy software designs are easier to express in code.
    • The most flexible design is code that does nothing but lazily accumulate what it could do until output is expected, at which point it works backwards from the output expectations to pull out the desired result (see the sketch after this list).
  • Multi-stack semantics is perfectly normal. 
    • For example, in the monadic code of the project above, the free monads have "their" variables, "normal" Python has its variables, and these are carefully kept separate.
    • Multiple flows of causality exist in the real world; stacks of different semantics are the only cheap way to capture them cleanly in software.
  • For every Yin, there is a Yang.
    • If there is more, there is also less; if there are variables, there are constraints; etc.
  • Structure is more important than types.
    • I started in functional programming without types. Then I embraced types, only to understand later that the sweet spot is somewhere in between. The reality is that it is much easier to use code to constrain software at "just the right level" than to use types.
  • Math is the most powerful structure.
    • Think geometry, topology, and dualities!
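
To make the "lazy design" and "multi-stack" bullets above concrete, here is a toy sketch, not hofppy code, of expressions that only record what could be done and are pulled from the output side; the expression's own variables are kept apart from the ordinary Python variables used to build it. The names Expr, var, add, mul and pull are assumptions made for this illustration.

class Expr:
    # A lazily recorded computation; pull() walks backwards from the requested output.
    def __init__(self, op, args):
        self.op, self.args = op, args

    def pull(self, env):
        if self.op == "var":
            return env[self.args[0]]   # the expression's own variables live in env
        values = [a.pull(env) for a in self.args]
        if self.op == "add":
            return values[0] + values[1]
        if self.op == "mul":
            return values[0] * values[1]
        raise ValueError(self.op)

def var(name): return Expr("var", [name])
def add(a, b): return Expr("add", [a, b])
def mul(a, b): return Expr("mul", [a, b])

# Building y does no arithmetic at all; it only accumulates what could be done.
y = add(mul(var("x"), var("x")), var("b"))

# Only when output is expected do we pull, binding the expression's variables.
print(y.pull({"x": 3.0, "b": 1.0}))  # 10.0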

 All original content copyright James Litsios, 2024.

Sunday, July 03, 2022

Some favorite early programs I wrote

These are all pre-2000, and in chronological order:
  1. A 'Hunt the Wumpus'-like maze adventure program on my Commodore P100 programmable calculator.
    • Given the maximum of 72 program steps and 10 memory cells of the P100, this was probably the first time I used a rudimentary hash noise function (in this case to generate the maze 'on the fly' rather than store it).
  2. A primitive LISP like eval loop on my HP-41C, with symbolic derivatives and simplification.
    • The integer part of each memory cell was the 'car' part, and the fractional part was the 'cdr' part (with an appropriate scaling); a toy reconstruction of this packing appears after this list.
  3. A Fortran 2d finite element solver, 'mesher' and plotter on my school's 'new' VAX/VMS.
    • The only early program that I still have, as it is printed as an appendix to my 'hand typed' BS EE thesis.
    • A bridge calculation:

  4. A two-player 3d wireframe helicopter-versus-tank simulator/shooter game (C and assembly, on my 4MHz Intel 8088 Canon AS-100).
    • The screen and keyboard were split in two. The game animated a bit more than 100 segments for both the helicopter and the tank. "Space dots" were projected in front of each 'camera' to improve the feeling of motion. The frame rate was about 4 frames per second.
    • Fun fact: the 3d rotation matrix was 'compiled' in real time into shift and add/sub instructions, as this was faster than using the CPU's multiply instructions.
  5. A Warnock algorithm that rendered only spheres, on my Canon AS-100. For example, 3d triangles were many small spheres, and cubes were a recursion of well placed spheres of various sizes.
    • The resulting renderings were very 'organic': all those 'spheres' meant that everything looked 'soft'.
    • The 'radius testing' code was somewhat similar to the (later) Doom distance calculation code, as it was one table lookup of the scaled argument, plus one fixed-point iteration to refine the result.
  6. A scan-line-based design rule checker and layout 'compactor' (at CSEM, for the analog CAD system we built).
    • Initially Manhattan (orthogonal) geometries, then 45° angles too, then any 2d geometry.
    • It implemented 'boolean' and 'sizing' operators on 2d polygon geometries, which my colleague Thomas Schwarz used to implement a window manager 'for fun' on the serial Tektronix color terminal. (Slow but ultra-cool effect!)
    • Mentor Graphics integrated this CSEM development into their ECAD suite: 
  7. A symbolic determinant simplifier using minimum-weight matching, used to symbolically extract dominant poles and zeros out of analog circuit sizings.
    • A better algorithm came out the same year and worked on weighted matroids.
    • It is this type of experience in applied math that gives you a 'bottom-up' understanding of Laplacian matrices.
  8. A multi-model hybrid-grid-graph simulator (as part of my PhD). 
    • The generic sparse matrix assembly based on priority queues meant that all sparse matrix formats and parallelization schemes were supported efficiently for very large systems (think supercomputers). As a by-product, compact models could efficiently be extracted from large grid based simulations, which ensured the simulator's and ISE AG's commercial success.
    • Grid properties could be computed using Jean-Bernard Lasserre's recursive simplex volume algorithm.  This generic n-dimensional algorithm was a bit slow but quite useful as many meshing systems provided incorrect volume/area/length values to the simulator for difficult mesh geometries.
    • Some early ISE AG goodies (ISE was later acquired by Synopsys): 

  9. A distributed object protocol with futures in C++ (early Actant).
    • Prior to Eurex, only VMS- and AIX-based APIs existed to quote options on Deutsche Börse. I had good experience serializing C++ objects to files (from my PhD), and I adapted this code into a remote-object socket protocol that allowed our Windows-based option quoting system to talk to VMS; we used this until Eurex went live (1998, I believe).
    • Remote calls could be chained as a 'future protocol'. While 'cool', I later disabled the 'futures' functionality, having learned that too much generality is sometimes too expensive to maintain.
    • Some Actant AG goodies: 
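
Coming back to item 2 above: the packing can be illustrated in a few lines. This is a toy reconstruction from the description, in Python rather than HP-41C keystrokes, and the SCALE value is an assumption.

SCALE = 1000  # assumed capacity: at most 1000 addressable cells

def cons(car_index, cdr_index):
    # One numeric cell holds both pointers: integer part = car, fraction = scaled cdr.
    return car_index + cdr_index / SCALE

def car(cell):
    return int(cell)

def cdr(cell):
    return round((cell - int(cell)) * SCALE)

cell = cons(42, 7)
print(car(cell), cdr(cell))  # 42 7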

All original content copyright James Litsios, 2022.

Tuesday, October 09, 2012

Functional meets Fortran

I decided my life needed a little bit more math, and so decided to add a few differential equations to some F# code. Math in code means applied math and lots of arrays, but arrays are a pain in a functional style. So here is my approach.

I have written many numerical applications, mostly in C++, yet also some in Fortran. I mention Fortran because it has the notion of equivalence, where you can map arrays onto each other. Equivalence is like a C union for arrays, and it is there to remind us that arrays do not need to be managed as dynamic memory.
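
As a loose analogue of what equivalence buys you (sketched here in Python/numpy purely for illustration, not the F# code this post is about): two names mapped onto the same storage, with no extra dynamic memory managed for the second view.

import numpy as np

workspace = np.zeros(8)
as_matrix = workspace.reshape(2, 4)  # a view onto the same storage, no new allocation
as_matrix[1, 0] = 5.0
print(workspace[4])  # 5.0: the flat array sees the update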

A functional style of programming is a lot about preserving your invariants with immutability, yet immutable vectors are not (yet) a solution. Still, numerical code has its own invariants, therefore some immutability is possible, and it should be exploited to write robust code. In my case, I have vectors and matrices; their sizes are invariant, and so is the structure of the matrices: they have lots of zeros, they are sparse. The approach is then to specialize a state monad to maintain size and structure in an immutable manner, while allocating mutable vectors only within the core of the numerical engine.
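
The real code is F#; the following Python sketch only illustrates the split described above, with the names MatrixStructure and numerical_core invented for the example: sizes and sparsity structure are immutable and built first, and mutable numeric storage appears only inside the numerical core.

from dataclasses import dataclass
import numpy as np

@dataclass(frozen=True)
class MatrixStructure:
    # Immutable invariants: dimension and the sparse (row, col) pattern.
    n: int
    entries: tuple  # tuple of (row, col) pairs, fixed once built

def numerical_core(structure, values, rhs):
    # Only here is mutable storage allocated and filled.
    dense = np.zeros((structure.n, structure.n))
    for (i, j), v in zip(structure.entries, values):
        dense[i, j] += v
    return np.linalg.solve(dense, rhs)

structure = MatrixStructure(n=2, entries=((0, 0), (1, 1), (0, 1)))
print(numerical_core(structure, [2.0, 4.0, 0.0], np.ones(2)))  # -> [0.5, 0.25]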

Long ago I wrote a matrix assembly algorithm. You can see it as sparse matrix structure inference code. It was really pretty simple: the final matrix is an assembly of many small sparse matrices; finding the final structure can be seen as solving a set of matrix structure equations; the algorithm defines an order of traversal for all the matrix points and then proceeds to "crawl" across all matrices at once, working out all the local structures. In a functional style of programming, I can use the same newVar-like concept you would find in a constraint or inference solver, extend it with an equivalent newEq, and build a monadic environment to support my numerical code. The result is that "application code" just uses variables and defines equations; these are then assembled, and only later does the code allocate just a few very large arrays to store the real data associated with the problem.
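
Again the original is F#; this Python sketch is my own minimal illustration of the newVar / newEq idea, with the Assembler class and its methods invented for the example (and a dense matrix used for brevity, whereas the real point is inferring sparse structure). Application code only declares variables and equations; storage is allocated only at the end.

import numpy as np

class Assembler:
    def __init__(self):
        self.n_vars = 0
        self.equations = []  # list of ({var: coefficient}, rhs)

    def new_var(self):
        v = self.n_vars
        self.n_vars += 1
        return v

    def new_eq(self, coeffs, rhs):
        # Nothing numeric happens yet; the equation is only recorded.
        self.equations.append((coeffs, rhs))

    def assemble(self):
        # Only now, after all structure is known, is storage allocated.
        a = np.zeros((len(self.equations), self.n_vars))
        b = np.zeros(len(self.equations))
        for row, (coeffs, rhs) in enumerate(self.equations):
            for var, coeff in coeffs.items():
                a[row, var] += coeff
            b[row] = rhs
        return a, b

# "Application code" only talks about variables and equations:
asm = Assembler()
x, y = asm.new_var(), asm.new_var()
asm.new_eq({x: 1.0, y: 1.0}, 3.0)   # x + y = 3
asm.new_eq({x: 1.0, y: -1.0}, 1.0)  # x - y = 1
a, b = asm.assemble()
print(np.linalg.solve(a, b))  # [2. 1.]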

This approach is "easy" because all the invariants are built first and then used. Much modern applied math depends on hierarchical and adaptive grids. Hierarchy is probably easy to bring into a functional style; adaptive grids are maybe less obvious.