For the first time in a long while, I considered recording an episode of Doubtfully Daily Matigo to outline something that happened at the day job today that rubbed me the wrong way; so much so that I felt offended by the very idea being presented. Despite being excessively opinionated, it takes a great deal to offend me, primarily because I try not to take things so personally or seriously that discussion becomes impossible outside an echo chamber. However, at the end of a meeting today, I was asked a rather simple question:

We need to add some JavaScript to every page. Is this done through the "Additional HTML" section of the admin pages?

This was a question regarding a bit of software that is being prepped for use at the day job. The JavaScript that's being added is designed to track what a person is doing on the website in excruciating detail. Where is the mouse? How much time passed between actions? What did a person click? And a whole lot more.

This I find offensive. The entire modern web is offensive.

Just about every site that we visit has trackers in place to extract as much data as possible from us, from browser details to frequency of visits to what colour socks we're wearing. What the fuck for? I do not buy the various arguments that companies have for the excessive amount of data collection that goes on behind the scenes when we're using a website.

Does a company need to know how long we're on a page? No.

Does a company need to know where a cursor is positioned while we're on a site? No.

Does a company need to know that we've visited a site 50 times in the last week? No.

Should a company use a third-party service to collect "metrics" that are compared and collated against data collected on other sites for the same visitor? Fuck! NO!

This isn't to say that organisations shouldn't have the ability to record some data about the people who use their services, but there needs to be a clear contract between the website and the visitor before any collection starts to take place. A lot of websites fail in this regard, some more spectacularly than others.

10Centuries does record information for every web request. I've outlined what's recorded in previous articles, but here is the list again:

  • the UserAgent sent by the browser (so that I can see what browsers are more common, which dictates bug-fixing efforts)
  • the IP address of the visitor (not that this value means very much anymore)
  • the resource the visitor requested
  • where a visitor came from (if known)
  • how long the whole process took to complete
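
For illustration only, the five fields above could be represented as a record like this. This is a minimal sketch in Python; the class name, field names, and sample values are my own assumptions, not the actual 10Centuries schema:

```python
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class RequestLog:
    """One per-request log entry (hypothetical shape)."""
    user_agent: str    # UserAgent string sent by the browser
    ip_address: str    # visitor's IP address
    resource: str      # the resource the visitor requested
    referrer: str      # where the visitor came from, if known
    duration_s: float  # how long the whole process took, in seconds
    logged_at: datetime = datetime.now(timezone.utc)

entry = RequestLog(
    user_agent="Mozilla/5.0 (X11; Linux x86_64) Firefox/115.0",
    ip_address="203.0.113.7",
    resource="/blog/some-post",
    referrer="",
    duration_s=0.142,
)
```

Nothing here identifies a person beyond the IP address, and there is no cross-site identifier at all, which is the whole point.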

This information is primarily used for 2 purposes:

  1. To work out the mean, median, and mode values for server response times. If response times climb above 0.3 seconds, I start investigating bottlenecks. My job as a provider is to ensure that content makes it to visitors in a fast and efficient manner.
  2. To work out how widespread a bug might be. Sometimes I'll learn that a function isn't working quite right in Opera or Firefox. Then I'll look at the stats and see that fewer than 0.4% of visitors across all of the 10C sites use these browsers. At that point I can decide whether the problem is worth solving right now or later in the day.
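
The first purpose above amounts to a few lines of arithmetic. Here is a rough sketch in Python; the function names and sample timings are invented for illustration, with the 0.3-second threshold taken from the point above:

```python
from statistics import mean, median, mode

SLOW_THRESHOLD_S = 0.3  # investigate bottlenecks beyond this point

def summarise_response_times(times):
    """Return the (mean, median, mode) of a list of response times."""
    return mean(times), median(times), mode(times)

def needs_investigation(times, threshold=SLOW_THRESHOLD_S):
    # Use the median as the "typical" response time, so a single
    # slow outlier doesn't trigger a false alarm.
    return median(times) > threshold

samples = [0.12, 0.15, 0.12, 0.18, 0.22, 0.12]
avg, mid, common = summarise_response_times(samples)
print(needs_investigation(samples))  # False: the median is well under 0.3 s
```

The median is a deliberate choice here: a mean can be dragged up by one pathological request, while the median reflects what most visitors actually experience.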

Anything beyond this amount of data is too much. Could I record more? Yes, of course. But to what end and at what price? Companies that collect far too much data generally get put on my blocklist rather quickly. If a project I'm responsible for at the day job is also added to this blocklist, then my job becomes exponentially more difficult. I will not soften my stance on trackers, even for the sake of employment. Fuck that. Online surveillance needs to stop. Not only is it excessive and unenlightening, it's downright offensive.