Some argue that counting uniques is meaningless, given that a typical user today uses multiple devices. True, counting unique visitors is out of the question unless visitor identity is actually enforced, so that multiple browsers belonging to one person can be deduplicated.
But one can still count unique browsers, and do so in a smart manner. The first step is to eliminate the problems with 3rd party cookies: they are often blocked, or simply wiped when the browser shuts down or the visit ends, and this causes a notable inaccuracy that inflates the counts.
The 1st image shows that, measured with 3rd party cookies, a site had more new browsers than returning browsers every week. Is this really the case? Tracking unique browsers with a 1st party cookie on the same data set makes it evident just how unreliable 3rd party cookies are.
However, it is crucial to remember that 3rd party cookies are the default when tracking browsers across domains.
Comparing the same dataset (3rd vs 1st party cookies) makes it evident that many of the "new" cookies seen with 3rd party cookies are in fact returning visitors, which is no surprise. The 3rd party cookie simply didn't stick!
Given this notable inaccuracy, any unique browser number generated for a web site from 3rd party cookies will be wrong straight out of the gate. Since Christmas fell in the middle of the reporting period in this example, the normal conclusion from image 1 would be that the spike came from new devices visiting.
New devices could account for a share of the "new" browsers, but that share cannot be verified. The comparison, however, reveals the inflation caused by 3rd party cookies. With the gap between total unique browsers and returning unique browsers reduced, the unique browser count is far more reliable than before.
It is important to keep in mind that 1st party cookies in no way guarantee that cookie inflation is eliminated; it is still part of the count, just reduced. One way to dampen that impact is to study cookie age.
If the bulk of cookies on your site at any given time are less than 2 weeks old, then cookie deletion is an issue that needs to be recognised and addressed.
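As a minimal sketch of that age check, assuming you can export the creation timestamp of each 1st party cookie seen in a period (the function name and sample dates below are hypothetical), you could compute the share of cookies younger than two weeks:

```python
from datetime import datetime, timedelta

def share_younger_than(created_dates, now, max_age_days=14):
    """Share of cookies created within the last `max_age_days` days."""
    young = sum(1 for d in created_dates
                if (now - d) <= timedelta(days=max_age_days))
    return young / len(created_dates)

# Hypothetical export: one creation date per cookie observed this period
now = datetime(2024, 1, 15)
created = [now - timedelta(days=d) for d in (1, 3, 5, 10, 40, 90)]

print(f"{share_younger_than(created, now):.0%} of cookies are under 2 weeks old")
```

If that share dominates, cookies are being deleted faster than browsers return, and the "new browser" count is largely churn rather than growth.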
On a typical media site with ~1 million unique browsers per week, the share of returning browsers is typically 50-70% when using a 1st party cookie. If yours is lower than that, cookie issues are skewing your data and inflating the counts.
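That benchmark translates directly into a sanity check. The figures below are invented for illustration; the 50% threshold is the lower bound of the range above:

```python
def returning_share(total_unique, returning_unique):
    """Fraction of this period's unique browsers that carry an old cookie."""
    return returning_unique / total_unique

# Hypothetical weekly figures for a media site
total, returning = 1_000_000, 450_000
share = returning_share(total, returning)

if share < 0.5:
    print(f"Returning share {share:.0%} is below the 50-70% norm; "
          "cookie churn is likely inflating uniques")
```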
One way to limit that impact is to use a "waterline" unique browser count: the value halfway between returning unique browsers (a reliable floor) and total unique browsers (the inflated maximum). The waterline is, however, very unlikely to catch on. Reporting lower numbers than before is normally very disturbing when ever greater numbers are demanded. This is why deduplication is a hard sell too, as it also produces lower numbers.
The question to ask those who oppose more reliable numbers is quite simple: is using inaccurate numbers of any benefit to your business or to your advertisers? It has been said repeatedly: using 3rd party cookies to track browsers on a site has not been good web analytics practice for quite a while.