PyCOZIR

I've started playing with the pretty nice Cozir CO2 sensor from GSS. This relates to some research projects on air quality control.

For testing purposes, the sensor is connected to my computer through a USB-serial converter cable. To communicate with the sensor (e.g. to grab the CO2 concentration data), I've written a bit of Python code that wraps the low-level ASCII communication protocol into a higher-level, more compact API.

For example, instead of exchanging byte codes, reading the temperature becomes:

>>> from cozir import Cozir
>>> c = Cozir('/dev/ttyUSB0')
>>> c.read_temperature()
20.5
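
To give an idea of what happens under the hood, here is a minimal sketch of such a wrapper class, assuming pyserial and the one-letter ASCII commands of the Cozir protocol. The class name, serial settings and temperature decoding below follow my reading of the datasheet and are only illustrative; see the repository for the real implementation.

import serial  # pyserial

class CozirSketch(object):
    """Illustrative sketch only, not the actual pycozir code."""

    def __init__(self, port):
        # Cozir sensors talk plain ASCII at 9600 baud over the serial line
        self.ser = serial.Serial(port, baudrate=9600, timeout=1)

    def _command(self, cmd):
        # send a one-letter command, read back one reply line
        self.ser.write(cmd + '\r\n')   # Python 2 str; encode to bytes on Python 3
        return self.ser.readline()     # e.g. ' T 01205\r\n'

    def read_temperature(self):
        # the reply encodes the temperature in 1/10 degC with a +1000 offset
        # (according to my reading of the datasheet)
        raw = int(self._command('T').split()[-1])
        return (raw - 1000) / 10.0

A call like c.read_temperature() then boils down to writing 'T\r\n' on the serial port and parsing the ASCII reply, which is exactly the kind of boilerplate the wrapper is meant to hide.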

For people who might be interested, all the details and the code are in the GitHub repository: https://github.com/pierre-haessig/pycozir

Also, the repo contains a basic logging program that can be adapted to your own needs. Tested for now with Python 2.7, on Linux and Windows.
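
Just to illustrate the idea (this is not the script from the repo), a minimal logging loop could look like the code below. The read_co2() method name and the 10-second sampling period are assumptions made for this example; check the repository for the actual API.

import csv
import time
from cozir import Cozir

c = Cozir('/dev/ttyUSB0')
logfile = open('co2_log.csv', 'ab')   # binary mode for the csv module on Python 2.7
writer = csv.writer(logfile)
writer.writerow(['timestamp', 'co2_ppm'])

while True:
    # one timestamped CO2 sample per row (read_co2() is hypothetical here)
    writer.writerow([time.strftime('%Y-%m-%d %H:%M:%S'), c.read_co2()])
    logfile.flush()   # make sure each sample actually reaches the disk
    time.sleep(10)    # sampling period: 10 seconds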


The Big Data Brain Drain: Why Science is in Trouble

This is the title of a blog post by Jake Vanderplas, researcher in Astronomy & Machine Learning at the University of Washington. He points out that conducting successful research requires more and more data manipulation skills, which go hand in hand with programming skills. However, in academia, the ability to write good software is not promoted, if not outright discouraged!

"academia has been singularly successful at discouraging these very practices that would contribute to its success"

"any time spent building and documenting software tools is time spent not writing research papers, which are the primary currency of the academic reward structure"

On the other hand, software skills are very important and therefore well rewarded in industry, hence the idea of a "Big Data Brain Drain" that pumps talented young graduates out of academic research.

After the diagnosis

Jake's post is the "medical diagnosis", and each disease calls for a treatment! Since the problem is sociological/organizational, the treatment must be sociological/organizational as well. Jake lays out four proposals, in particular an evolution of the criteria used to evaluate research. Of course, the "implementation details" of evaluation are always a tough issue, and not only for research (I'm thinking of learning and teaching evaluation here).

But in general, I hope that the recognition of good software will improve, along with the broader issue of reproducibility. In fact, I think that many academics are aware of the issue, but they just don't see a practical way out of the current "dead end" (and senior researchers don't have much time to work thoroughly on the issue):

"Making an openly available program for electrical machine sizing would be immensely useful for our research community! It would summarize 20 years of research of our group. I just don't take/find the time for it."

This is an (approximate and much shortened) transcript of the reaction of Hamid Ben Ahmed, one of my PhD advisors, when we discussed the topic this week. It means that in the field of electrical engineering (which has been tied for decades to closed-source software like Simulink or 3D finite element models), the feeling that "something is not working" is already there, and that's a good start!

Pushing the change

Now, it is all about academics pushing "le Système" (i.e. French academia), and not waiting for change to come "from the top". Indeed, I feel that top-level research directors have too many other things to deal with, like managing huge research consortia or writing huge evaluation reports... no time for "far away issues" such as reproducibility 😉

Let's just push!

PS: not all electrical engineering research runs on closed software. See for example the open-source work of Prof. Bernard Uguen and his team on radio wave propagation: www.pylayers.org (from IETR, a neighboring lab at University of Rennes 1).