New machine learning framework bridges the embodiment gap between robots and humans and enables quadrupedal robots to perform versatile movements like pouring soda, organizing shoes, and even cleaning up cat litter.
NaibofTabr (in reply to salty_chief):

Get a robot to ~~help around the house~~ observe your daily schedule, your habits, your every movement, and upload video, audio, sonar, lidar and radar recordings to ~~the cloud~~ probably just an unsecured S3 bucket. And then use all of that to profile you, sell you stuff, and send automatic reports to law enforcement about anything the AI flags as a possible indicator of criminal behavior.
Oh yeah, sign me right up for the corporate-controlled self-propelled surveillance platform. Maybe I'll get two, so there's never a gap in surveillance while one is recharging.
And if you think any of that sounds paranoid, you should be aware it's already happening with robot vacuums:

A Roomba recorded a woman on the toilet. How did screenshots end up on Facebook? (Eileen Guo, MIT Technology Review)

While it’s vacuuming your dirt, Roomba also collects data on you: Next, it could be sold
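To make the "unsecured S3 bucket" point above concrete, here is a minimal sketch (the bucket name is hypothetical) of how anyone, with no credentials at all, can list and download whatever a publicly readable bucket exposes. This is the failure mode behind many reported cloud-storage leaks, not a feature of any particular robot vendor.

```python
# Minimal sketch: reading from a publicly readable ("unsecured") S3 bucket.
# The bucket name is hypothetical; the point is that an anonymous, unsigned
# client succeeds whenever the bucket policy allows public access.
import boto3
from botocore import UNSIGNED
from botocore.config import Config

# An unsigned client sends no credentials at all.
s3 = boto3.client("s3", config=Config(signature_version=UNSIGNED))

# List and download whatever the bucket exposes.
resp = s3.list_objects_v2(Bucket="example-robot-telemetry", MaxKeys=10)
for obj in resp.get("Contents", []):
    print(obj["Key"], obj["Size"])
    s3.download_file("example-robot-telemetry", obj["Key"], obj["Key"].replace("/", "_"))
```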
salty_chief (in reply to NaibofTabr):

Small price to pay. They can watch me do my daily physical therapy, eat, play video games and watch TV. I hate to disappoint but I am not some secret agent hiding a bunch of shit. I also wouldn’t give them access to my internet or cell phone.
NaibofTabr (in reply to salty_chief):

It's highly unlikely that this thing would be able to operate without an Internet connection. There's no way it would have enough compute power on board to do a significant amount of image recognition (find the socks, pick up the socks, find the laundry hamper, deposit the socks in the laundry hamper) or voice command processing.

> I hate to disappoint but I am not some secret agent hiding a bunch of shit.

This is a very bad attitude to take towards your personal security, and part of the point I was trying to make is that there's a very high chance that a device like this would have poorly secured software. When you look at incidents like the multiple Wyze security camera breaches, you have to expect that consumer security is always an afterthought for companies that make these kinds of products. They will only start to care about it after something goes wrong and gets public attention (because it threatens sales), after which they will make a token effort to fix the problem (just enough to get a headline saying they did, so that it will stop hurting sales). So don't just think about the manufacturer/distributor having access to the surveillance data this thing will collect. Think about random people on the internet, a criminal with an interest in blackmailing people, or some random van driving by with a bunch of network gear on the back.

“So violated”: Wyze cameras leak footage to strangers for 2nd time in 5 months (Scharon Harding, Ars Technica)
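On the point about needing an Internet connection for the heavy image recognition: a minimal sketch of the cloud-offload pattern being described, where each camera frame is shipped off-device for detection. The vendor endpoint, parameter names, and response format here are hypothetical illustrations, not any real product's API; the point is that this architecture is exactly the data path the privacy concern is about.

```python
# Minimal sketch (hypothetical endpoint and API) of cloud-offloaded perception:
# the robot captures a frame locally, but recognition happens on a remote server,
# so the raw footage leaves the house on every request.
import requests

CLOUD_VISION_ENDPOINT = "https://api.example-robot-vendor.com/v1/detect"  # hypothetical

def find_socks(frame_jpeg: bytes, api_key: str) -> list:
    """Send one camera frame to the vendor's cloud service and return its detections."""
    resp = requests.post(
        CLOUD_VISION_ENDPOINT,
        headers={"Authorization": f"Bearer {api_key}"},
        files={"image": ("frame.jpg", frame_jpeg, "image/jpeg")},
        data={"classes": "sock,laundry_hamper"},  # hypothetical request parameter
        timeout=10,
    )
    resp.raise_for_status()
    return resp.json().get("detections", [])
```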