Original Arduino boards are uncertified?

I just got a new Arduino Uno board this morning, and got bitten by a strange message in the Arduino IDE (version 1.6.3, downloaded from when uploading a first test program:

This board comes from an uncertified manufacturer

Since the board comes from a trusted distributor (Lextronic), I was pretty surprised!

The reason behind the madness

Searching for this error message, I was even more surprised to discover that this is a side effect of a split between two Arduino "kingdoms":

  •, where I downloaded the Arduino IDE. They use the Arduino logo.
  •, which manufactured my "original made in Italy" board. They use the Arduino ® logo. Their boards can also be recognized by the Vendor ID 0x2A03.

I made a side-by-side picture comparison of the boards, using photos from the respective websites (the logo difference is particularly small: click to see it fullscreen).

The two official Arduino Uno boards side by side

How subtle! But behind this surprisingly minor cosmetic difference, I found a good report of the present situation (as of March 2015); it seems things are getting tense (trademark lawsuits).

In practice

I shouldn't forget to say that, apart from the annoyingly frightening warning message, my "uncertified" board from is in perfect condition and works perfectly with the current IDE!

(Future releases may behave differently, especially with new generations of boards developed separately.)

It's sad that many people will probably get bitten by this quite misleading message in the coming months.

PHP trick for FxOS Manifest file

While exploring the development of a web app for Firefox OS, I ran into a small problem with the app manifest file: this text file needs to be served with a specific Content-Type header.

As an illustration, this is the content of the manifest of my app (a JSON-formatted description of the app). It's hosted at http://é
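For reference, a minimal manifest.webapp is a small JSON file along these lines (all field values here are hypothetical placeholders, not the actual content of my manifest):

```json
{
  "name": "My App",
  "description": "A short description of the app",
  "launch_path": "/index.html",
  "icons": {
    "128": "/img/icon-128.png"
  },
  "developer": {
    "name": "Developer Name",
    "url": ""
  }
}
```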

Normally, this file is served as plain text. This can be seen by clicking on the link and opening the Page information dialog in Firefox (Ctrl+I). While plain text may be fine in another context, the Content-Type header is required to be set to application/x-web-app-manifest+json. More precisely, this is required for the app submission on the Firefox Marketplace. MDN recommends tuning the Apache server configuration with a .htaccess file. However, with my shared hosting plan at OVH, this was not working: the .htaccess file had no effect. [Edit: it seems to work now, so this PHP trick may not be needed anymore!]
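For reference, the Apache tweak recommended on MDN is essentially a single AddType line in the .htaccess file:

```apache
# Serve Open Web App manifests with the MIME type expected by Firefox
AddType application/x-web-app-manifest+json .webapp
```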

To bypass this hosting limitation, I tried a snippet of PHP code that simply sources the original manifest.webapp file, but serves it with the proper Content-Type header.
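Such a snippet boils down to sending the right header and then passing the file through. A minimal sketch (assuming manifest.webapp sits next to the PHP file):

```php
<?php
// Send the MIME type expected by the Firefox Marketplace,
// then pass the original manifest through unchanged.
header('Content-Type: application/x-web-app-manifest+json');
readfile('manifest.webapp');
```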

These two lines were enough to successfully submit my app to the Marketplace. The effect of the header can be seen in the browser: opening the URL (http://é now triggers the Download dialog instead of displaying the content.

Maybe this trick can help others?

ImmoCalc app on the Firefox Marketplace


My little mobile application, a house loan calculator ("Calculette de prêt immobilier"), aka "ImmoCalc", is now available on the Firefox Marketplace.

Thanks to the use of a small manifest file known as appcache, the site and the app (since they are the same thing in Firefox OS) should work perfectly offline, after a first connection to download everything, of course. For such a simple app, the appcache technology has the advantage of being simple (cf. the manifest.appcache file).
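For reference, such an appcache manifest is a plain text file starting with a CACHE MANIFEST header and listing the resources to cache (the file names below are hypothetical placeholders):

```
CACHE MANIFEST
# v1 - bump this comment to force clients to re-download everything

index.html
style.css
app.js
```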

One small problem: I noticed that installing it on Android (using an automatic APK generation system, cf. Open Web Apps for Android) yields an app that crashes at startup. That said, this installation is not very useful anyway, since the web page is easily accessible directly, with offline support.

For the record, the validation of the app (which is done by a human) took about a week.


A small web app for house loans

Over the last few weeks, I've started to play with jQuery Mobile to build a small web app for mobile phones, and in particular for my new Firefox OS phone (ZTE Open C). I've built a simple interactive calculator for house loans.

For now, the app is in French only (I don't know if house loans follow the exact same rules in other countries): "Calculette de prêt immobilier".

The app is hosted for the moment at http://é and it should be usable with any recent browser (mobile, tablet, or desktop). I may also take the time to package it for the Firefox Marketplace for true offline usage.



New academic position

After graduating in July, I am now an assistant professor at Supélec, effective September 1st. I'm not moving far, since I'll still be in Rennes, although on another campus. I will work in the control team (IETR-ASH), in the field of energy management. This may include work on grid energy storage, as I did during my PhD, but also energy management for buildings, and maybe some control for the grid itself (e.g. voltage control on the distribution grid).

Many thanks to all my former colleagues at the SATIE lab, at ENS Rennes and EDF R&D. The three years I spent with them for my PhD were really a great time!

Icedove 24 not starting due to calendar extension

After upgrading my Debian testing system yesterday, I had the bad surprise of not being able to read my emails anymore. I'm using Icedove (aka the unbranded Thunderbird available in Debian), and the software wouldn't start.

After a bit of searching, the solution happens to be quite simple. I've summarized the information I found here, in the hope that it may save time for others...

The cause: incompatibility with the Lightning calendar extension

Investigating the issue, I tried launching Icedove from the command line to look at the error messages.

Searching the web, I found that this bug was in fact reported a few months ago and is due to my using the Lightning calendar extension from Mozilla.

Debian Bug Tracker: icedove: incompatible with some external addons (#730450, #724688)

One message by Carsten Schoenert points specifically to the incompatibility:

if you using lightning from Mozilla this issue is normal. The symbol tables are incompatible between Icedove/libxul and upstream lightning. Please use iceowl-extension instead. The lightning package wan't work with the Icedove packages.

And the solution seems to be: use the iceowl extension instead.

But one question remains: how do I fix my email client? How can I remove the incompatible Lightning extension if Icedove doesn't start?

The solution: removing the extension using "Safe Mode"

It happens that removing this extension was easy enough by launching Icedove in Safe Mode, which disables all extensions. I had never used it before, but it is easy to do from the command line (presumably with the -safe-mode option, i.e. running "icedove -safe-mode").

Then, I could simply remove the Lightning extension using the regular add-ons management tab.

Now, restarting Icedove "normally", it just works!

Going forward

I'll still have to investigate the replacement solution of using the iceowl extension. But since I haven't used the Lightning calendar much for some time, I'll probably leave it like that...

Big Batteries Needed?

From "Big Batteries Needed To Make Fickle Wind And Solar Power Work", on NPR:

You can think of a fully charged battery as a source of energy, ready to sell its product to the electric grid, just the way a power plant does. For that to work, battery owners would need to buy electricity to charge the battery when the price is low, and then sell that electricity back to the grid when the price is high.

But that idea turns out to be a dud.

Not many articles in the mass media or in scientific journals take the time to explain how useful batteries can be for integrating renewables. But even fewer explain, as this NPR post does, that batteries, at their current cost and capabilities, are not ready for the massive grid deployment that some predict.

Fortunately, there is much ongoing research on battery technology (and on other storage technologies as well). The progress on batteries has been tremendous and steady since their invention (e.g. the impressive improvement of electric model aircraft since the 1970s), so there may still be technological leaps to come.

The Big Data Brain Drain: Why Science is in Trouble

This is the title of a blog post by Jake Vanderplas, researcher in Astronomy & Machine Learning at the University of Washington. He points out that conducting successful research requires more and more data-manipulation skills, hand in hand with programming skills. However, in academia, the ability to write good software is not rewarded, if not outright discouraged!

"academia has been singularly successful at discouraging these very practices that would contribute to its success"

"any time spent building and documenting software tools is time spent not writing research papers, which are the primary currency of the academic reward structure"

On the other hand, software skills are very important and thus well rewarded in industry; hence the idea of a "Big Data Brain Drain" which pumps talented young graduates out of academic research.

After the diagnosis

Jake's post is the "medical diagnosis", and every disease calls for a treatment! Since the problem is sociological/organizational, the treatment must be sociological/organizational. Jake lays out four proposals, in particular an evolution of the research evaluation criteria. Of course, the "implementation details" of evaluation are always a tough issue, and not only in research (I'm thinking of learning and teaching evaluation here).

But in general, I hope that the recognition of good software will improve, along with the general issue of reproducibility. In fact, I think that many academics are aware of the issue, but they just don't see a practical way out of the current "dead end" (and senior researchers don't have much time to work thoroughly on the issue):

"Making an openly available program for electrical machine sizing would be immensely useful for our research community! It would summarize 20 years of research of our group. I just don't take/find the time for it."

This is an (approximate and very shortened) transcript of the reaction of Hamid Ben Ahmed, one of my PhD advisors, when discussing the topic this week. It means that in the field of electrical engineering (which has been tied for decades to closed-source software like Simulink or 3D finite-element models), the feeling that "something is not working" is already there, and that's a good start!

Pushing the change

Now, it is all about academics pushing "le Système" (i.e. French academia), and not waiting for the change to come "from the top". Indeed, I feel that top-level research directors have too many other things to deal with, like managing huge research consortia or writing huge evaluation reports... no time for "far away issues" such as reproducibility 😉

Let's just push !

PS: not all electrical engineering research runs on closed software. See for example the open source work of Prof. Bernard Uguen and his team on radio wave propagation (from IETR, a neighbor lab of Rennes University 1).

Open science, reproducibility, coding errors

Just recently, going through Fernando Perez's G+ feed, I came across several links on open science and the reproducibility of science.

One blog post is about the non-evolution of Elsevier's publishing policies. As a result, Greg Martin, a mathematician in Vancouver, has decided to resign from the editorial board of Elsevier's Journal of Number Theory.

I also found two blog posts by Matthew Brett on the NiPy blog (the "NeuroImaging with Python" community) about Unscientific Programming and the Ubiquity of Error in computing. The latter asserts that computing tools in science easily lead to results with many mistakes (not to mention the recent discussion about mistakes in Excel spreadsheets). From my experience in computing for science, I very much agree. Often those mistakes are small (i.e. the order of magnitude of the result is preserved), but not always...

From my research...

Matthew Brett's blog reminded me of a pretty bad example of a computing error that I encountered when working on my latest conference paper, dealing with the modeling of a sodium-sulfur battery (cf. the PowerTech article on my publications page).

Just in time for the deadline, I submitted a "long abstract" in October 2012. I had just finished implementing the model of the battery and the simulation was up and running. One of the main figures presenting the results is copied here:

Evolution of a figure in my article between Abstract and Final version

Now, the interesting thing is to compare the October 2012 version to the February 2013 version (submission of the full paper). Beyond surface changes in the annotations, I've highlighted two big differences in the results. The most striking change is in the lower pane: the "cycle counting" drops from about 400 down to 25 cycles/month. One order of magnitude less!

What happened in the meantime...

Without entering into the details, there were several really tiny errors in the implementation of the model. I would not even call these errors "bugs"; they were just tiny mistakes. One was somewhat like writing Q += i instead of Q += i*dt (omitting the time step when counting electric charge). And when the time step dt is 0.1, that is an easy way to be off by exactly one order of magnitude! Spotting those errors in fact took quite some time, and probably one or two weeks were devoted to debugging in November (just after the submission of the abstract).
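To illustrate that kind of mistake with a sketch (made-up numbers, not the actual confidential model): summing a current without multiplying by the time step yields a charge count that is off by exactly a factor of 1/dt.

```python
# Hypothetical illustration of forgetting the time step in a charge count.
dt = 0.1                   # time step (hours)
current = [5.0] * 100      # constant 5 A over 100 steps, i.e. 10 h

Q_buggy = 0.0
Q_correct = 0.0
for i_k in current:
    Q_buggy += i_k         # mistake: sums amps instead of amp-hours
    Q_correct += i_k * dt  # correct charge increment (A x h = Ah)

print(Q_buggy)    # 500.0
print(Q_correct)  # 50.0 -> off by exactly 1/dt = 10
```

With dt = 0.1, the buggy count is exactly ten times the correct one, which is why the error preserves the shape of the curves and is so hard to spot by eye.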

Of course, reviewers had almost no way to spot this error. First, the code is not accessible (the battery model is confidential), and second, the value that was wrong (cycle counting) cannot easily be checked by qualitative reasoning.

Since I don't know how to solve the problem from the reviewer's point of view, let's get back to my position: the man who writes the wrong code. And let's try to see how to make this code a little better.

Testing a physical simulation code

There is one thing which I feel makes model simulation code different from other code: the high number of tests required to check that it works well. Even with only 3 state variables and one input variable, as for the sodium-sulfur battery, there are quite a lot of combinations to cover.

I ended up making the code report the evolution of the model over one single time step, in an ASCII-art table.

To cover enough situations, I have in fact 6 text files, each containing 5 such blocks. That's 30 tables to check manually, which is still doable, but there is no easy way to tell "oh, that -15,107 over there doesn't look right"... it just takes time and a critical eye (the latter being the most difficult to get).

Freeze the test for the future

If I now want to upgrade the battery simulation code, I want to ensure that the time my thesis advisor and I spent on checking the results is not lost. I want to enforce that the code always generates the exact same numerical results.

This is the (somewhat dirty) method I've used: run the same test, generate the ASCII string, and compare the result with a previous run stored in a text file. I pasted the code so that the word "dirty" gets its appropriate meaning:

Now, hopefully, the simulation code is doing what it should, and there won't be a third version of the figure in my article... hopefully!