hand job

[ hand job ]

noun, Slang: Usually Vulgar.
  1. an act of masturbation, especially as performed by a sexual partner on a man.



Word History and Origins

Origin of hand job

First recorded in 1935–40
