
MIT makes an AI smart carpet for monitoring people without cameras | ZDNet

Researchers at MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL) have come up with a way to use carpets to monitor humans without using privacy-invading cameras.

The so-called intelligent carpet could have applications in personalized healthcare, smart homes, and gaming. It also might offer a more privacy-friendly way of delivering healthcare to people who need to be remotely monitored by healthcare professionals. 

As MIT CSAIL notes, other research in this field has relied on devices like wearable cameras and webcams.

MIT’s system uses cameras only to create the dataset that was used to train the AI model. The neural network then uses sensors in the carpet to determine whether a person is doing sit-ups, stretching, or other actions.

“You can imagine leveraging this model to enable a seamless health monitoring system for high-risk individuals, for fall detection, rehab monitoring, mobility, and more,” says Yiyue Luo, a lead author on a paper about the carpet. 

MIT’s focus is on 3D human pose estimation using pressure maps recorded by a tactile-sensing carpet. 

“We build a low-cost, high-density, large-scale intelligent carpet, which enables the real-time recordings of human-floor tactile interactions in a seamless manner,” the researchers note in a new paper.

The researchers’ intelligent carpet measured 36 square feet and included an integrated tactile sensing array consisting of over 9,000 pressure sensors that can be embedded on the floor. It also included readout circuits that allow real-time recording of humans interacting with the carpet.
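As a rough illustration of what such a readout might look like in software, the sketch below scans a pressure-sensor grid and assembles each scan into a 2D pressure map. The grid size, sampling rate, and the `read_sensor` function are assumptions for illustration, not details from the paper.

```python
import time
import numpy as np

# Assumed layout: over 9,000 sensors could be arranged as a 96x96 grid (9,216);
# the actual arrangement in the MIT carpet is not specified here.
ROWS, COLS = 96, 96
FRAME_RATE_HZ = 10  # assumed sampling rate, for illustration only


def read_sensor(row: int, col: int) -> float:
    """Hypothetical driver call returning one sensor's pressure reading.

    In a real system this would query the readout circuits; here it simply
    returns a placeholder value.
    """
    return 0.0


def capture_pressure_map() -> np.ndarray:
    """Scan every sensor once and return a single 2D pressure-map frame."""
    frame = np.zeros((ROWS, COLS), dtype=np.float32)
    for r in range(ROWS):
        for c in range(COLS):
            frame[r, c] = read_sensor(r, c)
    return frame


if __name__ == "__main__":
    # Record a few frames in a loop, mimicking real-time recording.
    for _ in range(3):
        pressure_map = capture_pressure_map()
        print(pressure_map.shape, pressure_map.max())
        time.sleep(1.0 / FRAME_RATE_HZ)
```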

They collected over 1.8 million synchronized tactile and visual frames of 10 people performing a diverse set of activities, such as lying down, walking, and exercising.

The sensors in the carpet convert a person’s pressure into electrical signals through physical contact between their feet, limbs, and torso and the carpet, according to MIT CSAIL.

The researchers trained the system on paired tactile and visual data, such as a video and the corresponding pressure heatmap of someone doing a push-up. The AI model uses the visual data as ground truth and learns to infer 3D human poses from a person’s pressure on the carpet, so it can reconstruct an image or video of someone performing an action on the carpet without a camera recording them.
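A minimal sketch of how such training might be wired up is shown below, assuming PyTorch, a toy convolutional regressor, 21 body keypoints, and camera-derived 3D keypoints as the ground truth. None of these specifics (architecture, keypoint count, loss) are taken from the paper.

```python
import torch
import torch.nn as nn

NUM_KEYPOINTS = 21  # assumed number of body joints in the 3D pose


class PressureToPose(nn.Module):
    """Toy convolutional regressor: 2D pressure map -> 3D keypoints."""

    def __init__(self) -> None:
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, kernel_size=3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.head = nn.Linear(32, NUM_KEYPOINTS * 3)

    def forward(self, pressure: torch.Tensor) -> torch.Tensor:
        # pressure: (batch, 1, H, W) tactile frame
        features = self.encoder(pressure).flatten(1)
        return self.head(features).view(-1, NUM_KEYPOINTS, 3)


model = PressureToPose()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

# One synthetic training step: the pressure map is the input, and 3D keypoints
# extracted from the synchronized camera serve as ground truth.
pressure_batch = torch.rand(8, 1, 96, 96)           # tactile frames
camera_keypoints = torch.rand(8, NUM_KEYPOINTS, 3)   # vision-derived poses

predicted = model(pressure_batch)
loss = loss_fn(predicted, camera_keypoints)
optimizer.zero_grad()
loss.backward()
optimizer.step()
print(f"training loss: {loss.item():.4f}")
```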

“You may envision using the carpet for workout purposes. Based solely on tactile information, it can recognize the activity, count the number of reps, and calculate the amount of burned calories,” says Yunzhu Li, one of the paper’s authors.
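As a loose illustration of how reps might be counted from tactile data alone, the sketch below treats the summed pressure over the carpet as a 1D signal and counts its peaks. The thresholds and the calorie figure are made-up placeholders, not values from the researchers.

```python
import numpy as np
from scipy.signal import find_peaks


def count_reps(total_pressure: np.ndarray, min_height: float = 0.5,
               min_distance: int = 10) -> int:
    """Count repetitions as peaks in the summed per-frame pressure signal.

    min_height and min_distance are illustrative thresholds only.
    """
    peaks, _ = find_peaks(total_pressure, height=min_height, distance=min_distance)
    return len(peaks)


# Synthetic example: a noisy sinusoid standing in for push-up pressure cycles.
t = np.linspace(0, 10, 300)
signal = 0.6 + 0.5 * np.sin(2 * np.pi * 1.0 * t) + 0.05 * np.random.randn(t.size)

reps = count_reps(signal)
CALORIES_PER_REP = 0.4  # placeholder figure, not from the paper
print(f"reps: {reps}, estimated calories: {reps * CALORIES_PER_REP:.1f}")
```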
