Personal stuff

About me

This is a more discursive account of what I have done. For a shorter and more structured description, see my Curriculum Vitae and Publications.


I like physics, mathematics and programming, so after my second Vordiplom in physics at ETH Zürich I switched to the (back then) newly established Computational Science and Engineering curriculum. As the curriculum was still new, students had to complete the first two years in another faculty.

For my Diploma thesis I simulated superconductors with kappa close to 1/sqrt(2) in the Ginzburg-Landau theory, using a finite element library in C++. This is close to the point at which the Abrikosov vortices created by a magnetic field threading the superconductor (in regions of space where superconductivity breaks down) do not interact, i.e. the boundary between type I and type II superconductors.
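For context, the special role of kappa = 1/sqrt(2) can be stated in the standard Ginzburg-Landau notation (my thesis' conventions may have differed in details):

```latex
% Ginzburg-Landau free energy density (standard form):
f = f_n + \alpha |\psi|^2 + \frac{\beta}{2} |\psi|^4
    + \frac{1}{2m^*} \left| \left( -i\hbar\nabla - e^* \mathbf{A} \right) \psi \right|^2
    + \frac{|\mathbf{B}|^2}{2\mu_0}
% The Ginzburg-Landau parameter is the ratio of the magnetic penetration
% depth to the coherence length:
\kappa = \frac{\lambda}{\xi}
% Type I:  \kappa < 1/\sqrt{2}  (vortices attract, field is expelled)
% Type II: \kappa > 1/\sqrt{2}  (vortices repel, a vortex lattice forms)
% At \kappa = 1/\sqrt{2} the vortex-vortex interaction vanishes.
```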


After that, I wanted to earn some money and see the private sector. I ended up where I did because I liked NeXTSTEP: when I started at ETH I wanted to buy a computer to run it. Using the student discount, and the money from a prize for the best Matura (high school certificate) of type C (scientific) in my school, I bought a Sun SPARCstation 4. To get OpenStep running on it, I had to contact Uptime, the official NeXT distributor.

Being one of the few people with some knowledge of Objective-C(++), and enjoying OpenStep, I asked Uptime, which besides being a NeXT distributor was a document management firm (it handled all receipts of Migros, a few GB per week), whether there was an opening. I got the job and worked on bridging Objective-C and Java, replacing Distributed Objects (DO) as WebObjects was moving to Java while the core of their technology was still Objective-C. The core of my code (built on the Omni HTTP client/server code) was also used to interface with SAP, and I gained some exposure to interacting with SAP and building an SAP-compliant solution. Initially I planned to stay only six months and then start my PhD, but I ended up staying for nine. I then left Uptime, also because Uptime closed (and immediately reopened) after the company that had bought it (COSS AG) went bankrupt. It was an interesting time, at the end of the dot-com bubble, but that is another story.


I did my PhD at ETH Zürich with Michele Parrinello. He had come to ETH to be the director of the supercomputing center CSCS in Manno, but after a while he stepped down from CSCS and moved to the Università della Svizzera italiana (USI) in Lugano, while remaining an ETHZ professor. Thus I stayed in Ticino, even though I was enrolled at ETHZ. In my PhD I worked on CP2K, a quantum chemistry and solid state physics software package written in Fortran 90, working mainly on the DFT (Quickstep) and MD parts.


After a short stay at the Scuola Normale di Pisa, I won a Humboldt Fellowship and joined the research group of Prof. Sauer at the Humboldt University in Berlin. There I looked into polarizable models and discrete phase space exploration methods.

After having used Fortran 90 I wanted a more modern language, and I looked at various options. I even learnt Haskell, and used it to build an approximate lattice-growing method to construct low-energy lattice fragments. Finally, I chose the D language. Like C++, D supports compile-time execution, a very useful feature for heavy numerical computations, but in a cleaner way than the functional template language tacked onto imperative C++. Back then there was a stable D1 and an in-development D2. I decided to use Tango, an open source system library that offered better performance and features than the default Phobos library. I wanted parallelization support, so I wrote a shared-memory, work-stealing runtime loosely inspired by Cilk. I had some issues with it, and while fixing them I found bugs in the runtime itself, mainly missing or incomplete memory fences in the fiber-switching code, which caused rare random failures (the worst kind of bug you can have). After fixing them, I ended up becoming the maintainer of the Tango runtime. I developed several nice things in D1/Tango: random number generators, multidimensional arrays and linear algebra wrappers, the shared-memory parallelization, serialization, RPC, and discrete exploration of configurational space. Still, the community moved toward D2, leaving D1 behind; compiler support for D1 became hit and miss and finally disappeared (the older compilers are still available, but newer ones no longer support D1).

Together with Andrei Alexandrescu, I mentored Dmitry Olshansky on a Google Summer of Code project to implement compile-time optimization of regular expressions in D2, something that improved my knowledge of regular expressions and finite state automata.


C++, in the meantime, had improved. It cannot shed its complexity, but the new features create a subset that is pretty clean. Through one of the developers active in the D community (on the LDC compiler) I became aware of Nokia's Qt/Qt Creator effort, and I applied there. I came in at an interesting time: I worked first on an embedded system, and on using zeroconf to detect it in the developer tools. (Development tools, and in particular Qt Creator, are one of the main products developed in Berlin.) QML is the language and runtime used by Qt to define modern GUIs. I worked on the code model to better support it, and also a little on its JavaScript-based runtime. Given my background in Objective-C, I mainly worked on Mac OS X. Thanks to some prior work by Andre putting agreements with the FSF in place, I even managed to have a bug fix I made to gdb (which was affecting debugging in Qt Creator on Mac OS X) officially attributed to me. I also supported deployment and building on iOS, but that was a frustrating fight of reverse engineering and undocumented features, as Apple was not really interested in supporting non-Xcode IDEs. Finally, I worked on finite state machines, as they are widely used in the automotive industry.


It was a turbulent period for Nokia, Digia, and later the Qt Company, but I got to know and work with very nice people, and had a permanent job. Still, when I heard about NOMAD (Novel Materials Discovery), I decided to work on it because it looked interesting and close to what I had studied. I worked on making the data of atomistic simulations available and on providing tools to perform analytics on it. To do this I worked with many excellent people and several technologies: Docker and Kubernetes, Scala, Python, JavaScript (Node.js), Elasticsearch, … as described here in more detail.