Without seeing your metrics it’s pretty hard to gauge.
Keep in mind that those may not be people! There are HUGE numbers of automated bots crawling the internet daily. As an experiment (which is no longer running; I finally deleted it), in 2010 I put up a few sites using various domain names and various hosts, where the top level of each domain was simply a blank white page. I then placed links around the web pointing back to content below the top level. Four years later many of those sites were still getting daily traffic from a bot or two, and in one case what appeared to be a botnet visited every day.
While you would hope that your ad provider would filter that kind of traffic before reporting the metrics back to you, there are really two problems. First, the provider WANTS to tell you that you are getting hits; if you aren't, why would you keep buying $5.00 ads? Second, sometimes it's really hard to distinguish automated from human traffic. Sure, everyone can easily identify the googlebot when it comes a-crawling, but there is far more automated traffic, both legitimate and illegitimate, than the normal user ever imagines.
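To see why the well-behaved bots are the easy case, here's a rough sketch of the kind of user-agent filtering anyone can do over raw access logs. The signature list and the sample log entries below are made up for illustration; the point is that this only catches bots that announce themselves, while anything spoofing a browser user-agent sails right through.

```python
# Naive user-agent filtering: catches only bots that identify themselves.
# Signature list and sample user-agent strings are illustrative, not exhaustive.
BOT_SIGNATURES = ("googlebot", "bingbot", "crawler", "spider", "python-requests", "curl")

def looks_like_bot(user_agent: str) -> bool:
    """Flag requests whose user-agent matches a known bot signature."""
    ua = user_agent.lower()
    return any(sig in ua for sig in BOT_SIGNATURES)

# Hypothetical user-agents pulled from an access log.
hits = [
    "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36",
    "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)",
    "python-requests/2.31.0",
]

human_hits = [ua for ua in hits if not looks_like_bot(ua)]
print(len(human_hits))  # a spoofed browser UA would be counted here too
```

A bot sending a stock Chrome user-agent string looks identical to a person in this scheme, which is why real detection leans on behavioral signals (request timing, JavaScript execution, IP reputation) rather than headers alone.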