In the past, we worried about our webcams and microphones being used to invade our privacy. Now, even our keyboards are at risk. Laptop users could have private information such as messages, passwords, and credit card details stolen through a new method – the sound of their typing.
A group of British university researchers discovered that artificial intelligence can listen to the sound of typing and work out what is being typed – with up to 95% accuracy. This is a serious problem, and one likely to grow as AI tools keep improving.
Here’s how it works: an attacker uses a nearby device, such as a phone next to a laptop or the microphone on a video call, to record the sounds of typing. AI is then used to learn the slightly different sound each key makes when pressed, which lets the attacker reconstruct the typed text.
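The researchers’ real attack used a deep-learning model trained on recordings of actual keystrokes. As a toy illustration of the same idea, the sketch below synthesizes fake keystroke “clicks”, gives each key an invented acoustic signature, and identifies new recordings by comparing their frequency spectra against stored templates. Every name, frequency, and parameter here is made up for illustration – this is a minimal sketch of the technique, not the researchers’ method.

```python
import numpy as np

SR = 16_000  # sample rate in Hz (illustrative)
# Hypothetical per-key acoustic "signatures": in reality each key's sound
# differs subtly; here we fake that with a distinct dominant frequency.
KEY_FREQS = {"a": 900.0, "b": 1400.0, "c": 2100.0}

def keystroke(freq, dur=0.02, noise=0.05, rng=None):
    """Synthesize one key press as a short decaying tone plus noise."""
    if rng is None:
        rng = np.random.default_rng(0)
    t = np.arange(int(SR * dur)) / SR
    return np.sin(2 * np.pi * freq * t) * np.exp(-t * 200) \
        + noise * rng.standard_normal(t.size)

def spectrum(clip):
    """Normalized magnitude spectrum: the acoustic fingerprint of a press."""
    mag = np.abs(np.fft.rfft(clip))
    return mag / np.linalg.norm(mag)

# "Training": record one reference fingerprint per key.
templates = {k: spectrum(keystroke(f)) for k, f in KEY_FREQS.items()}

def classify(clip):
    """Label a recorded keystroke by its closest-matching template
    (cosine similarity between normalized spectra)."""
    s = spectrum(clip)
    return max(templates, key=lambda k: float(s @ templates[k]))

# "Eavesdropping": each press is a fresh, noisy recording.
rng = np.random.default_rng(42)
typed = [keystroke(KEY_FREQS[k], rng=rng) for k in "cab"]
recovered = "".join(classify(c) for c in typed)
print(recovered)
```

The real attack swaps these hand-made spectra for spectrograms fed to a neural network, but the core loop is the same: isolate each keystroke, extract its acoustic fingerprint, and match it to a key.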
In tests, the researchers recovered text typed on a MacBook Pro by recording the keystrokes with a phone placed nearby, achieving 95% accuracy. When the typing was recorded over a Zoom call instead, they were still correct 93% of the time.
To protect yourself, use strong passwords that mix upper- and lower-case letters, which makes it harder for AI to follow what you’re typing. Random combinations of letters, numbers, and symbols are much harder to recover than real words.
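As a quick sketch of that advice, here is one way to generate such a random password in Python, using the standard-library secrets module (which draws from the operating system’s cryptographic random source, unlike the ordinary random module):

```python
import secrets
import string

def random_password(length=16):
    """Generate a password mixing upper/lower-case letters, digits, and symbols."""
    alphabet = string.ascii_letters + string.digits + string.punctuation
    # secrets.choice picks each character with cryptographic-quality randomness
    return "".join(secrets.choice(alphabet) for _ in range(length))

print(random_password())
```

A password built this way has no dictionary words for an attacker to lean on, so even a partially recovered keystroke sequence is far less useful.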
You can also add extra layers of security, such as two-factor authentication and biometric verification. But the researchers warn that as AI gets smarter, it may find ways around these defenses too.