Trouble with wild lockdown locks? Engineer and part-time YouTuber Shane Wighton figured it would be fun to use his free time to build a robot hairdresser that could tame the mane at home, something many of us wish he had attempted before lockdown plunged us into a hair crisis.
Wighton is known for his YouTube channel Stuff Made Here, where he posts videos documenting the process behind his inventions, including this robotic barber: the ultimate hairdresser, programmed exactly to his specifications. We've all faced the vanity repercussions of lockdown, and with salons shut for months, most of us will never take them for granted again. Wighton has his own reasoning behind the project: he "would rather not have someone cut my hair who's also touching 100 other people's heads all day long."
Wighton realised it was time to take matters into his own hands, so he built the robot. Why not, if you're a practised engineer? The robot uses scissors rather than clippers, which adds to the complexity of the device. Would you really trust a robot to swing scissors that close to your ears, eyes and face? He did. His wife, on the other hand, did not.
So how does it work? The robot sections out chunks of hair, much as a hairdresser would use a comb and their fingers to separate out sections to trim. But robots aren't exactly known for intuition, which is why Wighton first had to translate the specific haircut he wanted into exact numbers the machine could follow. After some trial and error, he decided that a small hoover would gather and comb the hair better than a robotic arm would, and it took care of the clean-up too. Win-win.
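To make the idea of "sectioning" concrete, here is a minimal sketch, not Wighton's actual code, of how a scalp could be divided into a fixed grid of points for the nozzle to visit. The head radius and grid resolution are assumed values chosen purely for illustration.

```python
import numpy as np

# Illustrative only: lay a grid over the scalp in spherical coordinates
# around the head centre. All numbers below are made-up assumptions.
head_radius_m = 0.09          # assumed rough head radius, in metres
n_rings, n_per_ring = 6, 12   # assumed grid resolution

sections = []
for i in range(n_rings):
    polar = np.deg2rad(15 + i * 12)               # angle down from the crown
    for j in range(n_per_ring):
        azimuth = np.deg2rad(j * 360 / n_per_ring)
        # Point on the scalp where a chunk of hair would be gathered.
        x = head_radius_m * np.sin(polar) * np.cos(azimuth)
        y = head_radius_m * np.sin(polar) * np.sin(azimuth)
        z = head_radius_m * np.cos(polar)
        sections.append((x, y, z))

print(f"{len(sections)} sections to visit")       # 72 with this grid
```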
The big issue the engineer faced was getting the robot to know where his head was and where to cut. He added an Intel RealSense depth camera, used facial recognition to locate his head in the colour image, and then combined that with the depth data to place it in 3D space. Once this problem was solved, Wighton had to think about how exactly to tell a robot what haircut he wanted.
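The video doesn't show his source code, but the general approach can be sketched with the pyrealsense2 SDK and OpenCV: detect a face in the colour frame, read the depth at its centre, and deproject that pixel into a 3D point. The stream settings and the Haar-cascade detector below are assumptions for illustration, not Wighton's setup.

```python
import numpy as np
import cv2
import pyrealsense2 as rs

# Start colour + depth streams from the RealSense camera (assumed resolutions).
pipeline = rs.pipeline()
config = rs.config()
config.enable_stream(rs.stream.depth, 640, 480, rs.format.z16, 30)
config.enable_stream(rs.stream.color, 640, 480, rs.format.bgr8, 30)
pipeline.start(config)
align = rs.align(rs.stream.color)  # align depth pixels to the colour image

try:
    frames = align.process(pipeline.wait_for_frames())
    depth_frame = frames.get_depth_frame()
    color_frame = frames.get_color_frame()
    color_image = np.asanyarray(color_frame.get_data())

    # Simple face detection on the colour image (illustrative choice of detector).
    cascade = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
    gray = cv2.cvtColor(color_image, cv2.COLOR_BGR2GRAY)
    for (x, y, w, h) in cascade.detectMultiScale(gray, 1.3, 5):
        cx, cy = x + w // 2, y + h // 2
        depth_m = depth_frame.get_distance(cx, cy)  # distance at face centre, metres
        intrin = depth_frame.profile.as_video_stream_profile().intrinsics
        # Combine pixel position with depth to get a 3D head position.
        head_xyz = rs.rs2_deproject_pixel_to_point(intrin, [cx, cy], depth_m)
        print("Head centre (x, y, z) in metres:", head_xyz)
finally:
    pipeline.stop()
```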
Wighton programmed a 3D model of a head and 'painted' the haircut he wanted onto it as a gradient: in his case, lighter grey meant longer hair and darker grey meant shorter. The robot loaded the 3D model and worked out where to cut, to what length, and how many sections it would take to complete the whole look.
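As a rough sketch of that grey-to-length mapping (the length range and grey samples below are assumptions, not values from the video), each painted grey level can be converted linearly into a target hair length for its section:

```python
import numpy as np

# Hypothetical grey values sampled from the painted head model,
# 0 = darkest (shortest cut), 255 = lightest (longest cut).
grey_values = np.array([40, 120, 200, 255, 90])

# Assumed length range for the cut, in millimetres.
MIN_LENGTH_MM = 10.0   # darkest grey
MAX_LENGTH_MM = 60.0   # lightest grey

def grey_to_length(grey, lo=MIN_LENGTH_MM, hi=MAX_LENGTH_MM):
    """Linearly map an 8-bit grey value to a target hair length in mm."""
    return lo + (grey / 255.0) * (hi - lo)

print(grey_to_length(grey_values))  # target length per section, in mm
```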
In the live video of the robot's first try, Wighton looked mortified, admitting he was terrified of it ruining his hair. The whole haircut took the robot an hour to complete, largely because of a bug he found in his code mid-process that told the robot to cut six times as many locations as necessary. In theory, it should have taken the robot just 15 minutes to cut Wighton's hair.
The engineer didn't quite end up with the hairstyle he put into the system, getting instead a 'robotically perfect mullet', with a few minor defects he plans to iron out in the future. The verdict? Not too bad; he's got a month to grow a new set of locks while he tweaks the robot and tries again. If he perfects the device, who knows what the future of hairdressing will look like?