Managing concurrent Node.js versions with NVM

I am not a frontend expert. For a long time I balked at diving into that side of development because it was the Wild West, and I couldn't find the framework, the structures, the patterns that I appreciate on the backend.

Then came Node.js, and Angular, and TypeScript, and I realized that even with JavaScript under the hood, you could build applications that are rich and clean from a code standpoint. (That said, I remain convinced that the browser will only become the new OS once there is something other than JavaScript under the hood. How long until WebAssembly goes mainstream?)

But after this reconciliation came the problems, in particular managing the versions of Node, of @Angular/cli, and of the libraries, when working on several projects simultaneously, on different branches. Version conflicts arrive quickly, and you can lose hours resolving cryptic messages and compatibility issues.

To control the Node version deployed for a given project on Windows, the simplest solution is Node Version Manager (NVM) for Windows. The tool lets you switch from one Node version to another in two command lines:

nvm install 16.10.0   # install this version of Node
nvm use 16.10.0       # switch to this specific version

How to list Vertica tables by number of rows?

When working with Vertica, you may need to find out which tables have the largest number of rows. To do so, you must tap into the storage_containers system table, which provides metrics about the number of rows AND the number of deleted rows. But since containers are distributed, you will need to aggregate this data, and make sure to limit your query to the super projection, which is guaranteed to contain all the table's data. In brief:

with num_rows as (
    select schema_name,
           anchor_table_name as table_name,
           sum(total_row_count - deleted_row_count) as rows
    from v_monitor.storage_containers sc
    join v_catalog.projections p
         on sc.projection_id = p.projection_id
         and p.is_super_projection = true
    group by schema_name,
             anchor_table_name
)
select schema_name,
       table_name,
       max(rows) as rows
from num_rows
group by schema_name,
         table_name
order by rows desc;

Fixing error “The package org.xml.sax is accessible from more than one module: &lt;unnamed&gt;, java.xml”

After upgrading a dependency on Apache POI from version 4.2.0 to 5.0, my Java 11 Spring Boot application failed to start with the error “The package org.xml.sax is accessible from more than one module: &lt;unnamed&gt;, java.xml”.

This issue is not specific to Apache POI; it was introduced with Java 9 modules. Here, the problem comes from the java.xml module, which is included by default with Java 11 and conflicts with the one shipped with Apache POI for compatibility with Java <= 8.

The first thing you need to do in such a case is to identify where the conflict comes from, using the Maven dependency tree command:

mvn dependency:tree
[INFO] Scanning for projects...
[INFO] ------------------------------------------------------------------------
[INFO] Building Bartleby 0.0.1-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO] --- maven-dependency-plugin:3.1.2:tree (default-cli) @ bartleby ---
[INFO] com.riousset.bartleby:bartleby:war:0.0.1-SNAPSHOT
[INFO] +- org.springframework.boot:spring-boot-starter-data-jpa:jar:2.4.4:compile
[INFO] |  +- org.springframework.boot:spring-boot-starter-aop:jar:2.4.4:compile
[INFO] |  |  \- org.aspectj:aspectjweaver:jar:1.9.6:compile
[INFO] |  +- org.springframework.boot:spring-boot-starter-jdbc:jar:2.4.4:compile
[INFO] |  |  +- com.zaxxer:HikariCP:jar:3.4.5:compile
[INFO] |  |  \- org.springframework:spring-jdbc:jar:5.3.5:compile
[INFO] |  +- jakarta.transaction:jakarta.transaction-api:jar:1.3.3:compile
[INFO] |  +- jakarta.persistence:jakarta.persistence-api:jar:2.2.3:compile
[INFO] |  +- org.hibernate:hibernate-core:jar:5.4.29.Final:compile

Once the problematic dependency is identified, you have 3 options to fix the conflict:
A. upgrade the libraries to a Java 11 compatible version without the transitive dependencies,
B. exclude the conflicting dependency explicitly in the POM dependencyManagement, or
C. avoid the conflict by importing only the classes you need, without wildcards (*) in import statements.
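To illustrate option C, here is a minimal, hypothetical class (the class name and file name are my own, not from the original post) that imports org.xml.sax.InputSource explicitly instead of using import org.xml.sax.*, so each referenced class resolves unambiguously; on a plain JDK 11 it resolves to the java.xml module:

```java
// Explicit import: only the class we actually use, no wildcard.
import org.xml.sax.InputSource;

public class ExplicitImports {
    public static void main(String[] args) {
        // InputSource(String) stores the argument as the system ID.
        InputSource source = new InputSource("books.xml");
        System.out.println(source.getSystemId());
    }
}
```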

I chose option B, and excluded the xml-apis dependency from poi-ooxml:

		<!-- https://mvnrepository.com/artifact/org.apache.poi/poi-ooxml -->
		<dependency>
			<groupId>org.apache.poi</groupId>
			<artifactId>poi-ooxml</artifactId>
			<version>5.0.0</version>
			<exclusions>
				<exclusion>
					<groupId>xml-apis</groupId>
					<artifactId>xml-apis</artifactId>
				</exclusion>
			</exclusions>
		</dependency>
Running the dependency tree again then confirms that xml-apis is no longer pulled in:

mvn dependency:tree

Measuring JUnit code coverage with JaCoCo

Once you start working with unit tests, and understand how to use and write them, it's impossible to go back. They help you decouple code, gain confidence in the behaviour you're implementing, and make you faster by sparing you from starting, and restarting, a whole application.

Yet, one thing I had never used until now was code coverage. I just never took the time to dig into the topic, even though it's clearly useful. It helps maintain a team effort to ensure all new code is tested, and it gives you an idea of the risk you're taking when changing a line of code. The number of tests alone tells you nothing about coverage: they may be focused on one part of the application only, or at the opposite cover a little bit of every part, but none completely.

It turns out that adding code coverage measurement to a Maven Java project using JUnit is as simple as adding the JaCoCo plugin to the POM file. As usual, Baeldung is your reference, but in brief:


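A minimal sketch of that plugin declaration is shown below; the version number is an assumption (pick the latest available), the goal names are the plugin's standard prepare-agent and report goals:

```xml
<build>
    <plugins>
        <plugin>
            <groupId>org.jacoco</groupId>
            <artifactId>jacoco-maven-plugin</artifactId>
            <version>0.8.7</version>
            <executions>
                <!-- attaches the JaCoCo agent before the tests run -->
                <execution>
                    <goals>
                        <goal>prepare-agent</goal>
                    </goals>
                </execution>
                <!-- generates the coverage report after the test phase -->
                <execution>
                    <id>report</id>
                    <phase>test</phase>
                    <goals>
                        <goal>report</goal>
                    </goals>
                </execution>
            </executions>
        </plugin>
    </plugins>
</build>
```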
The JaCoCo plugin will output a jacoco.exec binary file, which can be consumed by third party tools like SonarQube. However, you can generate a more human friendly report using the report goal:

mvn clean jacoco:prepare-agent install jacoco:report

You'll then find a target\site\jacoco\index.html report detailing your code coverage.