Does Google Actually Care About Your Privacy, Unlike Apple?

Just like at every other tech event this year, one of Google's main areas of focus was privacy.
They kept telling us that we are in control of our data.
But why do tech companies suddenly care so much about our privacy?

This concern took off with the Facebook data leak last year, where user data was intentionally shared with advertisers, leading to a huge violation of users' privacy.

The world became aware of these problems, and websites were soon required to add privacy measures and be more open about their privacy practices.

Coming to 2019, Apple mentioned in their keynote that they care about our privacy and that everything is encrypted, but they gave no information on how our data is actually protected.
There are even cases like the Apple Card, where Apple monitors the user's spending habits and generates insights from them. That is not something that can be done entirely on-device; the data has to be sent to their servers in encrypted form to generate those results. This runs contrary to their stated focus on privacy, making it look like just another Apple stunt with some terms and conditions attached.

So how does Google change everything?


At Google I/O 2019, Google introduced on-device machine learning.
But wait, wasn't it already present?
Yes, it existed in some form, but the models lived on Google's servers. Since these models run to around 100 GB, it was impossible to download them to a phone, so our data had to be uploaded and processed in the cloud.
But this year Google changed everything by compressing that roughly 100 GB of model data down to about half a gigabyte, small enough that each device can download the model, process data locally, update it, and send only the updated model back to the server.
No raw data from the device is sent; only the updated model is transferred.
That removes most of the opportunity for data privacy violations.
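The flow described above — download the model, train on your own data locally, and send back only the update — is essentially federated learning. Here is a minimal sketch of the idea using a toy one-parameter model; all names, numbers, and the learning-rate choice are illustrative, not Google's actual implementation:

```python
# A minimal sketch of federated learning: each device trains on its own
# private data and sends back only a model update (a weight delta),
# never the raw data itself. Purely illustrative, not Google's code.

def local_update(w, device_data, lr=0.02):
    """One on-device gradient step for a tiny linear model y ~ w * x.
    Returns only the weight delta; the (x, y) pairs never leave the device."""
    grad = sum(2 * (w * x - y) * x for x, y in device_data) / len(device_data)
    return -lr * grad  # the update, not the data

def federated_round(global_w, all_device_data):
    """Server step (federated averaging): average the devices' deltas."""
    deltas = [local_update(global_w, data) for data in all_device_data]
    return global_w + sum(deltas) / len(deltas)

# Three devices, each holding private (x, y) pairs drawn from y = 2x.
devices = [[(1, 2), (2, 4)], [(3, 6)], [(4, 8), (5, 10)]]
w = 0.0
for _ in range(100):
    w = federated_round(w, devices)
print(round(w, 2))  # the shared model converges toward the true slope, 2.0
```

The server ends up with a model that reflects every device's data, yet it only ever sees averaged weight updates — which is exactly why this scheme reduces the chance of a privacy violation.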

But wait, there's an even bigger surprise!

We've talked about privacy, but what else do we get out of this?
Having these models on our own devices means we won't need an internet connection to use Google Translate, the Google Assistant, or any other Google service that relies on machine learning.
So the moment our devices get this support, speech-to-text will be processed locally, which will make Google Assistant interactions ultra fast and let us rely on touch far less often.

Check out this demo from Google I/O 2019.

Thanks for reading!

The Tech Infinite