Hope dies last – an attempt at a conclusion (7/7)

Privacy - The next big thing?

After a roughly six-month “corona delay”, I would like to finish my series of articles “Flying blind through online attribution”. When I published the first articles at the beginning of the year, nobody could foresee how the whole topic would develop. But the involuntary pause had an upside: a lot has happened since then, and a clear trend is emerging.

First of all, I would like to thank you for the many inquiries and the discussions that followed. I think it is right and important that the industry exchanges information intensively and that experiences – both positive and negative – are shared.

How the end of the Privacy Shield opens a new front

In July of this year, the European Court of Justice (ECJ) issued another groundbreaking ruling: the Privacy Shield agreement with the USA was struck down (see eRecht24). The ruling is so far-reaching because it opens up a fourth construction site for online attribution. Until now, the problems fell into three categories:

  • Technology (e.g. AdBlocker)
  • Legal (GDPR)
  • Usability (Cross-Device/-Browser)

Now a fourth problem is added: organizational. Admittedly, it is closely related to the legal problem, but it differs substantially in one respect: it affects not only marketing or tracking solutions, but almost every service, tool, SaaS, or other cloud-based solution from providers in the USA. Dr. Ronald Kandelhard provides a good summary of the consequences in his article “Your website is illegal: Privacy Shield violates GDPR (German)“.

This means that almost any data transfer to or via a provider from the USA that involves personal data is illegal with immediate effect. It does not matter whether the service provider receives the data as a commissioned data processor – a transfer to the USA is not possible due to the lack of data protection standards there. In this context, reference is often made to the standard contractual clauses. However, the ECJ made it clear that additional safeguards beyond these clauses are required to ensure compliance with European data protection law. In my opinion, this is unrealistic in practice.

As a result, a large number of popular services have become virtually unusable from one day to the next – and this time it is not just marketing or tracking tools. It hits many cloud-based services that run in the background, across all industries and areas:

  • Office/Collaboration: G-Suite, Microsoft 365, …
  • Project management: Asana, Trello, …
  • Messenger: Slack, MS Teams, …
  • E-mail: Mailchimp, Sendgrid, …
  • CRM: Salesforce, Zoho, …
  • Data migration: Funnel.io, Segment, Stitch, …
  • Monitoring: Sentry, Datadog, …
  • and many more

Whether and when legal certainty will be restored is currently not foreseeable. You can now either try to keep your IT stack unchanged and hope not to get “caught”, demand further guarantees from the US companies, or start looking for (European) alternatives. Either way, you risk a GDPR violation or face considerable organizational effort.

To come back to the actual topic of this series of articles: the end of the Privacy Shield is only indirectly related to attribution. But it compounds the problem and shows that the privacy issue will not solve itself! Technical developments, political decisions, and court rulings all point in the same direction: in my opinion, data protection is “the next big thing” – no matter how you personally feel about it.

So what to do?

Basically, there are three options:

  1. The “head in the sand” method
    Just carry on as before: integrate Google Analytics via GTM, load Facebook and a few other retargeting pixels. Leave the privacy policy outdated, perhaps even dating from pre-GDPR times. Use a number of tools, services, and plugins that spread the collected data across the internet via various SaaS and cloud services. At most, include a simple cookie notice (without consent, of course) and pray every day that the “problem” disappears by itself.

    Outlook: It is only a matter of time before this method blows up in your face. The risks are growing, and so are the expected costs – either because you get “caught” and face high penalties, or because you are forced to act at some point (a reprimand from the state data protection officer, a warning letter from competitors or consumer protection agencies, etc.). When that happens, you have to act hastily: everything is first ripped out ad hoc to avoid further damage. The consequence is a temporary blind flight – suddenly, everything you did not want to do has to be implemented very quickly. That costs time and money and is guaranteed to cause problems and errors. Most likely, the new reports will then deviate from the old ones, and the deviation cannot be explained because there are no comparative figures from an A/B test.

    This is probably the most inefficient method of all.

  2. The “Minimum-Maximum” method
    This method is all about getting the maximum out of the situation with minimum effort. Basically, you leave everything as it is and just put a consent layer in front of your page. The cookie layer is, of course, optimized for a maximum consent rate and uses every trick in the book to achieve it. You orient yourself on everyone else and do the minimum, so as not to attract attention and to preserve the status quo for as long as possible. Everything stays as it is – well, there is this annoying cookie layer, but everyone else has one too. Users are annoyed anyway and will automatically click “Accept”. After that, everything runs as usual – and surely you won’t be the first one to get caught.

    Outlook: Sounds great, doesn’t it? In fact, this method does not exist: you can achieve neither the one nor the other. It is the proverbial tilting at windmills, and in the end you solve none of the problems. Privacy features (ITP/ETP) block more and more cookies, and conversion rates decrease noticeably from day to day. At the same time, the risk increases – not necessarily the risk of being the first to get “caught”, but of the one big change everyone is afraid of. It does not matter what it is: ITP version X, cookie layers that must offer equivalent buttons, or a Chrome update that brings something unpredictable. Once such a change is announced, you run out of time. Suddenly this issue has the highest priority and all other projects grind to a halt.

  3. The “roll up your sleeves” method
    This method takes a holistic approach to the topic. It analyzes which data is collected where, how, and with which tools. Each tool is then audited: Are there European alternatives (keyword: Privacy Shield)? Can the data be anonymized, or at least pseudonymized, before it is transferred? Is consent required for its use, and if so, what is the cost/benefit of the tool, also considering the data involved and the risk of attribution loss? Are there technical options to implement the tracking without the standard implementation (keyword: server-side tracking)? The goal of the audit is

    a) to minimize the risks (legal & organizational)
    b) to improve the attribution (technical & usability)

    Outlook: This method requires a very high initial effort. Collecting and documenting all tools is laborious, and developers also have to invest considerable effort in the implementation – data flows have to be analyzed, APIs revised, cron jobs extended or adapted. It is therefore best to incorporate these tasks directly into the developer sprints and implement them step by step.

    Should my hypothesis prove true and data protection really become “the next big thing”, you will be glad to have implemented all this early and in parallel to the current tracking setup. That way, you can compare the new figures with the existing ones and detect and understand deviations. If, on the other hand, I am wrong and the data protection discussion fizzles out or is defused, you have admittedly only collected karma points and could have saved yourself the extra effort.

Where to start?

At everysize, we started the changeover more than a year ago and have run various solutions in A/B or even A/B/C tests. Based on the lessons learned during this time, I would like to give the following recommendations:

  • Check UTM parameters and ensure that they are used everywhere.
    The point is generally known, but unfortunately still too often underestimated. What is more important – and new – is that the UTM parameters should also be integrated stringently into your own system landscape, i.e. not just processed via Google Analytics. This means reading the UTM parameters yourself, saving them in your own session/cookie, and storing them with the record when the funnel is completed (e.g. sign-up, order, etc.). This can be useful later for offline tracking implementations.

  • Do not use tracking or cookie switches, OR implement them correctly.
    This was already covered in “How the Google Tag Manager and cookie switches falsify tracking“. In practice, problems usually arise when the customer does not implement the necessary adjustments, or does so too late. It is therefore essential to check the current setup and, if you are working with a third-party provider, to ask for the latest updates.

  • Reorganize web analytics.
    The days when you could simply integrate a JavaScript snippet via GTM or GA and everything just worked are over. Ad blockers, ITP, and ETP ensure that these scripts are simply blocked, or at least that their cookies are curtailed – and that is before even considering the requirements of the GDPR and the effects of consent layers. Better: set up an alternative tracking system, either with an on-premise solution like Matomo or with server-side tracking via the Google Analytics Measurement Protocol (German).

  • More server-side tracking.
    Server-side tracking is currently on everyone’s lips – often the interest is justified by the fact that it can no longer be blocked by ad blockers. But I think server-side tracking has an even bigger advantage: you regain control over your data! Even Google has recognized that the trend is heading in this direction – it is not without reason that GTM can now be deployed server-side. Markus Baersch provides instructions in his article “Server-side tagging with Google Tag Manager: First impressions (German)” (P.S. his blog contains many good, detailed articles – if you are interested, read them!). Server-side tracking also works with Google Ads conversion tracking (German) and many other services. So my tip: check which of the services you use also offer APIs that can replace the standard JavaScript solution.

  • Reposition marketing channels and reinterpret campaign KPIs.
    Due to ever-increasing tracking losses (see articles 2–6 of this series), it is no longer appropriate to blindly trust the captured numbers 1:1. Channels must be analyzed individually and also subjected to a risk assessment. This includes, for example, the probability that a channel will break away, or that it is (or will be) affected above average by the attribution problem. Take retargeting, which is known to be hit especially hard by the elimination of third-party cookies: already today, these cookies are blocked disproportionately often, and on top of that, consent is required on the website. Every blocked cookie therefore leads to a loss of attribution. Of course, other channels suffer attribution losses too – but usually smaller ones than retargeting. So it makes no sense to add a flat 50% to every channel. Rather, each channel must be evaluated individually and receive its own “measurement loss markup”. Only then can you compare the individual campaign results and make sound scaling decisions.

  • Question your direct traffic.
    Actually old hat, but given its importance and the many questions about “How much is normal?”, let me state it explicitly once more: a direct traffic share of more than 50% does not necessarily mean that you have an “extremely strong brand”. It means that you either have extremely weak marketing or, more likely, an extremely strong tracking problem. This, in turn, means that the conversions attributed to Direct largely belong to other channels – the question is which ones.
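
To make the UTM recommendation above more concrete, here is a minimal sketch of reading the `utm_*` parameters yourself and persisting them in a first-party cookie. The function names, cookie name, and lifetime are my own illustration, not a prescribed implementation:

```javascript
// Extract all utm_* parameters from a landing-page URL.
// Example result: { utm_source: "newsletter", utm_medium: "email" }
function extractUtmParams(urlString) {
  const searchParams = new URL(urlString).searchParams;
  const utm = {};
  for (const [key, value] of searchParams.entries()) {
    if (key.startsWith('utm_')) {
      utm[key] = value;
    }
  }
  return utm;
}

// In the browser, the result could then be persisted in a first-party
// cookie so it is still available when the funnel is completed
// (cookie name "attribution" and the 30-day lifetime are assumptions):
function storeUtmParams(utm, days = 30) {
  const expires = new Date(Date.now() + days * 864e5).toUTCString();
  document.cookie =
    'attribution=' + encodeURIComponent(JSON.stringify(utm)) +
    '; expires=' + expires + '; path=/; SameSite=Lax';
}
```

On order completion, the cookie value can be read back and written to your own database alongside the order, independent of Google Analytics.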
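
The server-side route via the Google Analytics Measurement Protocol mentioned above is, at its core, just an HTTP POST with a few form parameters. Below is a minimal sketch of a pageview hit; the property ID is a placeholder and the helper names are my own:

```javascript
// Build a Universal Analytics Measurement Protocol (v1) payload
// for a pageview hit.
function buildPageviewHit({ propertyId, clientId, host, page }) {
  return new URLSearchParams({
    v: '1',          // protocol version
    tid: propertyId, // UA property ID, e.g. 'UA-XXXXXXX-1' (placeholder)
    cid: clientId,   // anonymous client ID, generated and stored server-side
    t: 'pageview',   // hit type
    dh: host,        // document host name
    dp: page,        // document path
  }).toString();
}

// Send the hit from your own server (Node 18+ global fetch):
async function sendHit(payload) {
  await fetch('https://www.google-analytics.com/collect', {
    method: 'POST',
    headers: { 'Content-Type': 'application/x-www-form-urlencoded' },
    body: payload,
  });
}
```

Because the hit originates from your server, ad blockers never see it – in return, you become responsible for generating a stable client ID yourself and for respecting the user’s consent before sending anything.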
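
As a toy illustration of the channel-specific “measurement loss markup” described above (all figures are invented, not benchmarks): if retargeting is estimated to lose 40% of its attributions to blocked cookies while paid search loses only 10%, the measured conversions can be scaled back up per channel before comparing them:

```javascript
// Scale measured conversions back up by a channel-specific
// estimated attribution loss rate (0 <= lossRate < 1).
function adjustForLoss(measuredConversions, lossRate) {
  return measuredConversions / (1 - lossRate);
}

// Invented example: both channels report 90 conversions, but with
// different estimated loss rates they are not comparable 1:1.
const channels = [
  { name: 'retargeting', measured: 90, lossRate: 0.4 }, // 40% assumed loss
  { name: 'paid search', measured: 90, lossRate: 0.1 }, // 10% assumed loss
];
for (const channel of channels) {
  console.log(channel.name, Math.round(adjustForLoss(channel.measured, channel.lossRate)));
}
// retargeting is estimated at ~150 true conversions, paid search at ~100
```

The point is not the exact formula, but that each channel gets its own estimated factor instead of a flat markup across all channels.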

The early bird catches the worm

The recommendations in this article are, of course, not exhaustive. Rather, my aim is to show that the challenges for good tracking are growing because of the data protection issue. There are currently many open construction sites, resulting from different requirements that also change regularly, so good planning is essential to avoid unnecessary extra work. From my own experience, I know how time-consuming all this can be – and without “best practices” or “how-tos”, acquiring the knowledge during live operation is costly. You have to try and test a lot and climb a learning curve. Of course, you can also just wait until the big bang comes and only then get active, hoping to find tutorials and ready-made solutions to quickly copy and paste. But this topic is slightly different: the data protection discussion is actually moving faster than the technical development. That is why I think it is important to be prepared and to put the attribution topic on the agenda instead of putting it off forever.

I hope the series “Flying blind through online attribution” has helped one or the other reader and brought the attribution topic a bit more into focus. I am always happy to receive feedback, discussions, exchanges of experience, and constructive criticism.

Author: Eugen

CTO & Co-Founder of everysize.com