Discovering Google's marketing tools: Google Ads, Google Analytics and Google Tag Manager

I have only just discovered Google's marketing tools, and I must say it is both fascinating and a little disorienting. Google Ads, Google Analytics and Google Tag Manager seem tightly interconnected, but their complexity can make getting started intimidating. Here are my first impressions and what I have understood so far.

An interconnection between three tools

Apparently, these three tools work together, but they started out as independent products. This sometimes creates confusion, because terms like "conversions" or "events" are used in several contexts, with different meanings depending on the tool. Google seems to have tried to clear up these overlaps by renaming some concepts, but that makes older tutorials harder to follow, as they are sometimes no longer accurate.

The main functions of each tool (as far as I understand them)

Google Tag Manager: centralizing tags

Google Tag Manager (GTM) is a tool for managing tags. These tags are used to track events or interactions on a website. What is interesting is that GTM is not limited to the Google ecosystem: it can also be used with other platforms such as Facebook or LinkedIn. I see it as a practical tool for centralizing tag management and improving data consistency.
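In practice, tags usually react to events pushed into the page's data layer. Below is a minimal, hypothetical sketch of what such a push could look like; the event name and the extra field are my own inventions, not taken from any real configuration:

```javascript
// The GTM container snippet normally creates window.dataLayer; here we
// fall back to a plain array so the sketch is self-contained.
var dataLayer = dataLayer || [];

// Hypothetical helper: push a custom event that a trigger configured
// in the GTM interface could listen for.
function trackAppointmentClick() {
  dataLayer.push({
    event: 'appointment_click',          // assumed trigger name
    buttonLabel: 'Prendre rendez-vous'   // assumed extra data-layer variable
  });
}
```

A tag in GTM would then be configured to fire whenever an "appointment_click" event appears in the data layer.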

Google Analytics: understanding visitor behavior

Google Analytics appears to be the main tool for analyzing what visitors do on a site. It answers questions such as:

  • Who visits my site?
  • Where do these visitors come from?
  • What do they do on the site (journey, visit duration, etc.)?

One thing I noticed is the notion of key events, which were apparently called "conversions" before. These events track specific actions, such as a click to book an appointment or to open a simulator. This looks like a good way to measure how effective a page is.
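To make this concrete, here is a small sketch of how an event can be recorded with gtag.js. The two bootstrap lines mirror Google's standard snippet, but the event name and parameter below are my own assumptions, not taken from a real setup:

```javascript
// Standard gtag.js bootstrap: gtag() simply queues its arguments on the
// data layer, where Google Analytics picks them up.
var dataLayer = dataLayer || [];
function gtag() { dataLayer.push(arguments); }

// Record a hypothetical event when a visitor opens the simulator.
gtag('event', 'open_simulator', {
  page_location: '/simulateur-droits-succession'  // assumed page path
});
```

In the Analytics interface, such an event could then be marked as a key event to feed into campaign measurement.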

Google Ads: creating and tracking advertising campaigns

Google Ads is the tool for creating online ads. It has two main parts:

  1. Campaign creation: choosing keywords, setting a budget and writing the ads.
  2. Performance tracking: measuring whether the ads meet their goals.

From what I understand, the integration with Google Analytics makes it possible to import the key events (formerly "conversions") so you can analyze how well your campaigns perform.

An example based on my tests

To try to understand things better, I imagined a campaign promoting an inheritance tax simulator. Here is how the tools interact:

  1. Google Tag Manager: create a tag to track clicks on the button that opens the simulator.
  2. Google Analytics: analyze the visitors who land on the landing page, click the button or submit their information.
  3. Google Ads: serve the ads and measure conversions through the key events imported from Analytics.

This helped me see how the different parts can be connected to give an overall picture. However, I am not sure about everything yet; there are probably subtleties I have not grasped.

Conclusion

These tools are both fascinating and complex. I have not understood everything yet, but it seems clear that they can deliver impressive results when used properly. I think it takes time to get familiar with their features and the way they interact, but it is probably worth it to optimize a marketing strategy.

Migrating a NAS-based organization to Microsoft SharePoint and Microsoft 365 Groups

Here's the context: you are a lawyer working on many business cases. You rely on a local file server (NAS) to store all your contracts, documents and files, with one folder for each business case. You're successful, and hire one employee, then another. You also start collaborating with experts outside your organization, and you need to manage collaborative work on all your Word, Excel and PowerPoint documents. You need to manage access to the files, because not everyone has access to everything. Moreover, you'd like to keep track of who is doing what, and ease communication on each business case.

That's when everything crumbles. Files get duplicated, and you no longer know which version is the most up to date. Emails get out of control: people tend to copy the whole planet to make sure everyone concerned receives their email, or conversely they click "reply" instead of "reply all" and forget to keep a critical person in the loop.

Your wish list to Santa then becomes something like this:

  1. All my files should be stored in a single place, accessible from within and from outside my organization
  2. My files should be versioned, so that I can easily restore a previous version
  3. Several people must be able to edit a document concurrently, without a painful merge process
  4. I should be able to easily grant or remove access to my files
  5. I should have a global access-control strategy for my documents, while still being able to fine-tune permissions for specific resources or collaborators
  6. A web page or wiki would give an overview of recent activity on each case
  7. An email address should be associated with each case, to ease document transmission
  8. I should be able to define tasks and their owners, to keep track of who's doing what, and when
  9. A calendar would allow me to keep track of events related to each case

All this is what you get by switching to a Microsoft 365 environment: a SharePoint Team site for each case, together with its associated Microsoft 365 group, its default documents library, email address, home page, and Microsoft Planner plan for task management.

However, if you have an existing environment with tens or hundreds of open cases, migrating manually will take an awful lot of time and will be error-prone.

Fortunately, this can be automated with PowerShell scripts. Microsoft provides PowerShell extensions for its cloud environments, but we will rather use PnP PowerShell, "a cross-platform PowerShell Module providing over 650 cmdlets that work with Microsoft 365 environments". This makes creating the SharePoint Team site and all its associated resources a breeze (or almost).

The PowerShell scripts below take care of creating the SharePoint Team site and uploading all files and folders from a local folder to its documents library.

But first, you need PowerShell 7. On my Windows 11 machine, the installed PowerShell version was 5.1. You can manually download and install PowerShell 7 from the Microsoft Store.

Then, install the PnP.PowerShell module, or use the setup.ps1 script:

Install-Module -Name PnP.PowerShell
Import-Module PnP.PowerShell

Then, given the SharePoint root URL, the display name of the site to create, the local folder to upload, the owner's email and the site mail nickname, the migrate.ps1 script creates the SharePoint site and uploads the files. For example:

.\migrate.ps1 https://riousset.sharepoint.com 'Affaire 123' '.\Affaire 123\' nicolas@riousset.onmicrosoft.com affaire-123 

The migrate.ps1 script:

if ($args.count -lt 5) {
    $scriptName = $MyInvocation.MyCommand.Name
    write-host "Usage:" $scriptName "<sharepoint root url> <display name of the site to create> <folder to upload> <owner email> <mail nickname>"
    exit
}

# Config variables
$AdminSiteURL = $args[0]
$displayName = $args[1]
$folder = $args[2]
$owner = $args[3]
$mailNickname = $args[4]

$description = $displayName

Try {
    # Connect to PnP Online
    Connect-PnPOnline -Url $AdminSiteURL -Interactive

    # Create a new Microsoft 365 group; the associated SharePoint Team site is created along with it
    $site = New-PnPMicrosoft365Group -DisplayName $displayName -Description $description -MailNickname $mailNickname -Owners $owner -Members @($owner) -IsPrivate

    write-host ($site | Format-Table | Out-String)

    # Populate the new site's documents library
    & "./upload.ps1" $site.SiteUrl $folder
}
Catch {
    write-host -f Red "Error:" $_.Exception.Message
}

migrate.ps1 then calls the upload.ps1 script, which populates the SharePoint documents library from the given local folder:

# Upload files to a SharePoint site
# https://theitbros.com/powershell-upload-file-to-sharepoint/#penci-Install-the-PnP-PowerShell-Module

if ($args.count -lt 2) {
    $scriptName = $MyInvocation.MyCommand.Name
    write-host "Usage:" $scriptName "<sharepoint site url> <folder to upload>"
    exit
}

# What is the target SharePoint site URL for the upload? 
# $spoSite = 'https://riousset.sharepoint.com/sites/HRAdmin2/' 
$spoSite = $args[0] 

# What is the location of the files to upload? 
$localFolder = $args[1]

# What is the target document library relative to the site URL?
# ("Documents partages" is the server-relative name of the default library on a French-language tenant)
$spoDocLibrary = "Documents partages"

Connect-PnPOnline -Url $spoSite -Interactive

# Sanity check: display the site we are connected to
Get-PnPSite

function Upload-Folder {
    param (
        $localFolder,
        $remoteFolder
    )

    Write-Host 'Uploading folder' $localFolder.ToString() 'to' $remoteFolder.ToString()

    # Create the remote folder if it doesn't exist yet
    $resolvedRemoteFolder = Resolve-PnPFolder -SiteRelativePath $remoteFolder

    # Upload all files from the local folder
    $files = Get-ChildItem $localFolder -File
    foreach ($file in $files) {
        Write-Host 'Uploading file' $file.FullName.ToString()
        $uploadedFile = Add-PnPFile -Path ($file.FullName.ToString()) -Folder $remoteFolder -Values @{"Title" = $($file.Name) }
    }

    # Recurse into subfolders
    $subfolders = Get-ChildItem $localFolder -Directory
    foreach ($subfolder in $subfolders) {
        $folderName = Split-Path $subfolder.FullName -Leaf
        Upload-Folder -localFolder $subfolder.FullName -remoteFolder "$remoteFolder/$folderName/"
    }
}


Upload-Folder -localFolder $localFolder -remoteFolder $spoDocLibrary

If the scripts succeed, you should see the new site appear in your SharePoint admin portal, usually https://<tenant>-admin.sharepoint.com/

How to delete a 0 KB file reported as not found by Windows?

I ran into this incredibly frustrating issue while generating some CSV files for GPT Assistants: a 0 KB file had been created, and when I tried to delete it, the operation failed with a Windows error message reporting that the file didn't exist.

The file properties didn't look odd, except for the 0 KB size and the missing extension.

After testing some command lines with "del", trying to rename the file, move it, and delete the whole folder with third-party tools like Total Commander, the only thing that worked was this command line, which removes the whole folder (the "\\?\" prefix makes Windows skip its usual path normalization, which is what allows the malformed entry to be reached):

rd /s "\\?\d:\code\xml2csv\target\test-classes"

And eventually, I found the cause: the problematic filename had been generated automatically by an app and contained characters that are invalid on Windows: "Code civil – Titre préliminaire : De la publication, des effets et de l'application des lois en général.csv". The ":" is forbidden in Windows filenames and was responsible for the invalid file state; the name was displayed only up to the ":".
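To avoid this class of problem when generating files automatically, the forbidden characters can be stripped before the file is created. Here is a minimal sketch; the function name and the hyphen replacement are my own choices, not what the original app did:

```javascript
// Characters Windows forbids in file names: \ / : * ? " < > |
// Replace each with a hyphen, then trim trailing dots and spaces,
// which Windows also rejects at the end of a name.
function sanitizeFileName(name) {
  return name.replace(/[\\/:*?"<>|]/g, '-').replace(/[. ]+$/, '');
}
```

With something like this in place, the ":" in the troublesome title above would have been replaced before the file ever reached the disk.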

Release Failed when publishing to Sonatype

While trying to publish a new version of my fork of mysql-backup4j, the release of the latest version failed with the error "Event: Failed: Repository Writable", whether I tried to publish using "mvn deploy" (whatever the value of the autoReleaseAfterClose setting) or from the Nexus Repository Manager portal.

The "mvn deploy" command returned the following error:

[ERROR]
[ERROR] Nexus Staging Rules Failure Report
[ERROR] ==================================
[ERROR]
[ERROR] Repository "frneolegal-1040" failures
[ERROR]   Rule "RepositoryWritePolicy" failures
[ERROR]     * Artifact updating: Repository ='releases:Releases' does not allow updating artifact='/fr/neolegal/mysql-backup4j/1.2.3/mysql-backup4j-1.2.3.jar'

The Nexus Repository Manager portal displayed a similar failure report.

Once identified, the issue was kind of obvious: my Maven "target" folder hadn't been cleaned before compiling, and the artifacts of the previous version were still present. The deploy operation pushed both the new and the old artifacts, triggering a release failure, since the previous version had already been released.

Deleting the "target" folder manually – or better, running "mvn clean" – solved the issue. And to prevent any future occurrence, just use the following Maven command:

mvn clean deploy