A previous article (link at the bottom of the page) showed how Google Data Studio could be used to do a bit of data mining on which pages the "blank" query actually pointed to, in an effort to extract at least a little more intelligence on search behaviour.
Just a few days after that article was posted, Google removed the blank query from the Google Search Console dataset, citing user privacy.
For web sites where the "blank" query made up a large share of queries, this means the loss of quite a bit of data and insight.
The result is the opposite of what Google stated ("We believe that omitting anonymous queries from all query-filtered results is more consistent"); in practice the data is now less consistent, because data is simply missing.
First the loss of search phrases in the referrer string (resulting in the "not provided" tsunami), now the loss of the blank query. What next?
If a site had a large share of blank queries in its search query data, then there is less data available as of 19th August 2018 to make use of in Data Studio.
The cut-off can be visualized in Data Studio: simply filter on the query field and exclude every query string other than the blank one.
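The same check can be done outside Data Studio. Below is a minimal sketch that computes the daily share of clicks attributed to the blank query from a Search Console performance export. The row structure and the "date", "query" and "clicks" field names are assumptions for illustration, not the actual export format.

```python
# Sketch: estimating the share of "blank" (anonymised) queries per day.
# Field names ("date", "query", "clicks") are assumed, not Google's schema.
from collections import defaultdict

def blank_share_by_date(rows):
    """Return {date: fraction of clicks whose query string is empty}."""
    totals = defaultdict(int)
    blanks = defaultdict(int)
    for row in rows:
        clicks = int(row["clicks"])
        totals[row["date"]] += clicks
        if not row["query"].strip():
            blanks[row["date"]] += clicks
    return {d: (blanks[d] / totals[d] if totals[d] else 0.0) for d in totals}

# Hypothetical sample data: the blank share drops to zero after the cut-off.
rows = [
    {"date": "2018-08-18", "query": "", "clicks": "40"},
    {"date": "2018-08-18", "query": "example term", "clicks": "60"},
    {"date": "2018-08-20", "query": "example term", "clicks": "60"},
]
print(blank_share_by_date(rows))  # → {'2018-08-18': 0.4, '2018-08-20': 0.0}
```

Plotting that share over time makes the cut-off date, and any earlier spikes, immediately visible.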
This is how the data loss looks in a chart: as of 19SEP2018 there is no longer any data...
Once visualized, some rather odd spikes stand out. They appear to be the result of automated queries, i.e. bots hitting Google, which raises questions about the Search Console data itself.
Rather than deal with the issue, Google decided it was better to hide the blank query data.
Perhaps Google can't handle the obvious cardinality problem the blank queries cause. The end result is that we all lose valuable data, which limits the usability and reliability of the Search Console.
This is something every user of the Search Console needs to be aware of: the reports are in fact a reduced subset of the total data set.