An iframe from googlesyndication.com tries to access the camera and microphone
I think this sounds more like some sort of fingerprinting attempt. It's good to see that random access to these kinds of resources fails thanks to new(er) browser controls. However, that does not mean the fingerprinting actually failed.
There is probably some way to determine whether the request was denied automatically by the browser or manually by the user (e.g., the time it takes to get a "response"), which can itself be used for fingerprinting.
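A rough sketch of how that timing-based distinction might work. Note the threshold and the classification labels are my own assumptions for illustration, not anything measured in the article:

```javascript
// Sketch: distinguish an instant, policy-based denial from a manual one
// by timing how long the getUserMedia() rejection takes. The 100 ms
// threshold is an illustrative assumption, not a measured value.
function classifyDenial(elapsedMs, thresholdMs = 100) {
  return elapsedMs < thresholdMs ? "auto-denied" : "user-denied";
}

async function probePermissionTiming() {
  if (typeof navigator === "undefined" || !navigator.mediaDevices) {
    return null; // not running in a browser
  }
  const start = Date.now();
  try {
    const stream = await navigator.mediaDevices.getUserMedia({ audio: true });
    stream.getTracks().forEach((t) => t.stop());
    return "granted";
  } catch (e) {
    // An immediate rejection suggests the browser (or a policy) denied it;
    // a slow one suggests a human saw and dismissed a prompt.
    return classifyDenial(Date.now() - start);
  }
}
```

Either outcome is a bit of entropy for a tracker, which is why the mere failure of the request doesn't mean the attempt was useless.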
Which reminds me of fingerprinting by tiny differences in the audio API provided by browsers [0]. Super interesting, but also a bit depressing. Also works for things like canvases and WebGL.
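Canvas fingerprinting works roughly like this sketch: render fixed content, then hash the pixel output, which varies slightly across GPUs, drivers, and installed fonts. The drawing parameters and the choice of hash here are illustrative assumptions:

```javascript
// FNV-1a hash of a string, used to condense the canvas output into a
// short identifier. Math.imul keeps the multiply in 32-bit space.
function fnv1a(str) {
  let h = 0x811c9dc5;
  for (let i = 0; i < str.length; i++) {
    h ^= str.charCodeAt(i);
    h = Math.imul(h, 0x01000193) >>> 0;
  }
  return h.toString(16);
}

// Draw fixed text and hash the rendered result. Tiny rendering
// differences between machines yield different hashes.
function canvasFingerprint() {
  if (typeof document === "undefined") return null; // not in a browser
  const canvas = document.createElement("canvas");
  const ctx = canvas.getContext("2d");
  ctx.font = "14px Arial";
  ctx.fillText("fingerprint-probe", 2, 20);
  return fnv1a(canvas.toDataURL());
}
```

The AudioContext variant is the same idea: generate a fixed signal, read back the samples, and hash the tiny floating-point differences.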
EFF allows you to check how fingerprintable your browser is [1]. Do note that the results may not be very accurate.
This is not google, but a third party ad network serving ads through google.
Google tries to sandbox creatives, and develops browser features, precisely to prevent issues like this.
This is likely a script that somehow avoided google's malware scanning pipelines.
This is definitely not google's malintent.
Disclaimer: ex-Googler, worked in ads, dealt with problems like this all the time.
The author is concerned that an ad might be able to surreptitiously turn on the camera or microphone, but these are not accessible by default. In this case, it isn't even getting as far as a permissions prompt because the default Feature Policy doesn't allow camera or mic access in cross-origin iframes. (E.g., for Chrome: https://sites.google.com/a/chromium.org/dev/Home/chromium-se...) Instead, I think the most likely thing happening here is that an advertiser is running a script that is trying to do fingerprinting, and which is blocked by this browser protection.
(That it's an iframe running on https://[random].safeframe.googlesyndication.com tells us it's an ad served through Google Ad Manager, and the contents of the iframe are supplied by the advertiser.)
Disclosure: I work for Google, speaking only for myself
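To illustrate the default-deny mechanism described above: a cross-origin iframe gets no camera or mic access unless the embedding page explicitly delegates it, which an ad frame never receives. A minimal sketch (the `ad.example` URL is a placeholder):

```html
<!-- Cross-origin iframes get no camera/mic access by default.
     The embedder would have to opt in explicitly, e.g.: -->
<iframe src="https://ad.example/creative.html"
        allow="camera; microphone"></iframe>

<!-- A site can also turn these features off everywhere via a
     response header (Permissions-Policy, formerly Feature-Policy):

     Permissions-Policy: camera=(), microphone=()          -->
```

Without that `allow` attribute, `getUserMedia()` in the frame rejects immediately, which matches the console errors in the article.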
This is Exhibit A for why no one will ever convince me to turn off my ad blocker or switch away from Firefox. It's a great feeling to just not have to worry about this entire class of exploits.
I really don’t care what advertisers think is necessary to ensure targeting/fingerprinting or to counter fraud; running an arbitrary third-party script isn’t anything I consider remotely acceptable from an ad. At worst I could accept some generic script from the ad network being run - but the ad network piping through the advertiser's own third-party script should get them blocked by the browser without even requiring a plugin.
Is this just click bait? I don't know the intricacies of Google's ad serving, but is this not just someone (e.g., an ads customer) slipping a request for camera and mic access into an ad script? But the title seems to suggest Google is doing something malicious here.
Shitty ad code barfing errors onto the console is typical, unfortunately. The JS is not written by Google, it's written by the individual advertiser, with very limited oversight.
Does googlesyndication.com serve anything that is in the user's interest? I've had that domain blocked for several years and don't think I've ever noticed it hindering any experience.
Quite ironic that the website posting the article actually loads googlesyndication.com itself.
I wonder what would happen if you visited in a browser with it set to allow video/audio with no prompts? Would it bail out with a "oh crap, they actually let us" or would it actually try and do something?
I have to wonder if anti-fingerprinting is the wrong approach to privacy invading advertising. There’s an inherent asymmetry between the resources available to those who build these systems and those who try to stop them.
I’d love to see more stuff like CCPA. As a California resident I can simply tell Google that my data is not for sale, and they’re obligated to respect that regardless of what fingerprinting happens.
This isn’t an ideal solution, but the whole issue of privacy seems like a people/politics problem we keep trying to solve with technology.
On modern android devices there's a "Quick settings developer tiles" option called "Sensors Off" that's available after you enable developer mode.
After you enable that, a "Sensors Off" button appears when you pull down the notification/quick-settings menu. Toggling it disables the microphone, camera, fingerprint reader, accelerometer, and other sensors.
I wonder if it's a fingerprinting attempt gone wrong? I can't see a reasonable, even malicious, reason for eavesdropping on mic/camera at that scale. You'd be capturing a ton of data you need to manually process and clean up, which takes time and resources; most people won't stay on the page long enough to capture much sensitive info; and even then, eavesdropped conversations seem pretty useless unless you also have the full context and information on the person you're targeting to misuse that data effectively.
This sounds like one small piece of common fingerprinting techniques.
It would have been nice to see the author address that possibility, but it seems fingerprinting is not mentioned.
The laptop camera can be disabled with a small bit of black electrical tape, but I don't understand whose crazy idea it was to put microphones into laptops in the first place, especially without hardware kill switches like the Librems have. The same goes for modern cell phones, of course.
They recently changed their Meet application for some reason. I use it without video, voice only; recently it has started trying to force me to use both. I have to refuse both, then once in the meeting go and enable voice only. I loathe all things Google but am forced to use Meet for work.
The proof here is extremely weak, and the crowd has a very predictable anti-Chrome response even though this has nothing to do with Chrome. I think the domain is checking whether a camera and mic are present, not turning them on and accessing content.
Well, at least that requires clicking (not to diminish this report), but five years ago it was a zero-click proposition (like Forbes: https://www.networkworld.com/article/3021113/forbes-malware-...). While I don't want to diminish their revenue, the fact that blocking online ads significantly strengthens your security posture is not lost on private companies and governments (like US CISA: https://www.cisa.gov/sites/default/files/publications/Capaci...) alike.
Viewing this on a Framework Laptop that just came in today and I'm laughing because it's the first laptop I've had that actually has hardware switches for camera and microphone.
We have a chance to notice this when it's mediated by a browser. Native apps don't barf all over the console and thus escape such scrutiny.
I first considered this when a friend told me about a brand of lawnmower of which I had never heard let alone searched (mowing lawns is the least interesting activity I can imagine), and one minute later a podcast app had a big banner at the top by which I could purchase a lawnmower of that exact brand. I don't have important conversations in the vicinity of mobile phones anymore.
Another reason to disable JS globally while doing heavy surfing, only temporarily whitelisting/enabling it on sites you trust.
A physical, hard-disconnect on-off switch for mic and camera should be required by law.
Serious question - why do we need iframes? Can't we disable them?
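For what it's worth, a site operator who wants no embedded frames on their own pages can get close to "disabling" them with Content Security Policy. A sketch, not something the thread confirms any site does:

```html
<!-- Disallow all embedded frames on this page via CSP.
     frame-src can be set in a meta tag or, more commonly,
     in the Content-Security-Policy response header. -->
<meta http-equiv="Content-Security-Policy" content="frame-src 'none'">
```

That said, iframes are the main isolation primitive the web has for embedding third-party content, so ad networks rely on them rather than injecting scripts directly into the host page.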
Injecting malware into ads is as old as ad networks. There are even ad networks that hijack the ads of other networks and replace the original ads with their own.
There are also many different types of clickjacking.
Best way to solve this would be a physical switch that cuts power to the webcam and microphone... sadly I don't know of any laptops that actually implement this.
At least some (e.g., Lenovo) have physical shutters to cover the webcam lens, so even if the cam turns on, it records only a piece of black plastic.
I don't think it's Google's fault. Google sometimes trades ads at auction: they issue an HTTP request to partners asking "Hey, do you want to show an ad here?", partners respond with a price and HTML code, the highest bidder wins, and that HTML is inserted.
The HTML can contain JavaScript, so theoretically anything can be executed within the browser (I've seen people mining bitcoins!).
Google can't monitor and execute every HTML snippet, but they do a pretty great job sampling responses and evaluating some of them. Fraudsters are smart and try to detect whether their code is executing on Google's servers, but overall they are losing.
It seems like a case where google's system didn't work.
By the way, all Google partners are listed here: https://developers.google.com/third-party-ads/adx-vendors. Usually, it's possible to track down who exactly is responsible by looking at the dev console.