cybrkyd

PHP hit counter

 Wed, 20 Aug 2025 16:28 UTC

Following on from my previous post Counting unique webpage visitors, I have made a hit counter. The idea was not to use JavaScript: ad blockers and browsers will block a JavaScript hit counter once discovered, as it falls squarely into tracker territory. Pixel tracking is soooo sneaky and I’m not a fan. And no cookies, ever.

That left me with PHP for server-side tracking, which is similar to how the raw stats are captured on-server. This can then be easily passed to a flat file database and pulled into each page where I want to show a hit count.

I also do not want to count “hits” in the classic sense; I would like to log actual unique page visits.

My requirements

  1. No database, no cookies.
  2. Privacy-respecting, so hash the IP addresses.
  3. Ensure that bots and serial page refreshers are kept at bay.
  4. Self-cleaning (no CRON).

Note: This setup is specific to my Hugo-generated website. I only want to see hit numbers on actual blog post pages, not index.html or other pages, but I will explain below how to use this everywhere.

The logic

This hit counter records how many times a specific page (for example, /post/some-post-name/) is visited. It counts each unique visitor only once per hour, so repeated visits by refresh-happy people don’t inflate the count within any 60-minute window.
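
For the curious, here is a rough sketch of what the flat file ends up holding once the script below has been running for a while: per page path, a running total plus, under each hour-aligned Unix timestamp, the hashed visitor IDs seen in that hour. The values below are made up for illustration, and the real file is written as a single unformatted JSON line:

    {
      "/post/some-post-name/": {
        "total": 42,
        "1755705600": ["a1b2c3d4", "9f8e7d6c"],
        "1755709200": ["a1b2c3d4"]
      }
    }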

PHP hit counter script usage

  1. The CSS consists of only one line:

    .hit-counter{font-size:0.7em;color:#666;text-align:center;margin-top:1em}
    
  2. For the /post pages (as per my Hugo setup), the HTML and on-page script sits in my /layouts/_default/single.html file:

    {{ if eq .Type "post" }}
    <div class="hit-counter">
      Visitors: <span id="hit-counter">
        {{ if .Site.IsServer }}[Local]{{ else }}Loading...{{ end }}
      </span>
    </div>
    {{ if not .Site.IsServer }}
    <script>
    document.addEventListener('DOMContentLoaded', function() {
      if (window.location.pathname.includes('/post/')) {
        fetch(`/hits.php?path=${encodeURIComponent(window.location.pathname)}`)
          .then(response => response.json())
          .then(data => {
            document.getElementById('hit-counter').textContent = data.count || '0';
          })
          .catch(error => {
            console.error('Error fetching hit count:', error);
            document.getElementById('hit-counter').textContent = '0';
          });
      }
    });
    </script>
    {{ end }}{{ end }}
    

    If not using Hugo, the on-page HTML and script to add is:

    <div class="hit-counter">
      Visitors: <span id="hit-counter">Loading...</span>
    </div>
    <script>
    document.addEventListener('DOMContentLoaded', function() {
      fetch(`/hits.php?path=${encodeURIComponent(window.location.pathname)}`)
        .then(response => response.json())
        .then(data => {
          document.getElementById('hit-counter').textContent = data.count || '0';
        })
        .catch(error => {
          console.error('Error fetching hit count:', error);
          document.getElementById('hit-counter').textContent = '0';
        });
    });
    </script>
    
  3. Here is my final production hit counter, hits.php, which lives in the website root:

    <?php
    $data_file = __DIR__.'/data/visits.json';
    $salt = "My_Salty_Passphrase";
    $path = $_GET['path'] ?? '';
    
    // Validate path (only allow /post/slug/ format)
    if (!preg_match('~^/post/[a-z0-9\-]+/$~i', $path)) {
        header('Content-Type: application/json');
        echo json_encode(['error' => 'invalid_path']);
        exit;
    }
    
    // Generate anonymous IP hash (8 chars)
    $ip_hash = substr(hash('sha256', $_SERVER['REMOTE_ADDR'] . $salt), 0, 8);
    $current_time = time();
    $time_window = 3600;
    $time_key = floor($current_time / $time_window) * $time_window;
    
    // Read and update data (with file locking)
    // Note: the /data directory must already exist and be writable by the PHP user
    $lock = fopen($data_file, 'c+');
    flock($lock, LOCK_EX);
    // fopen('c+') creates the file if it is missing, so handle an empty or invalid file gracefully
    $data = json_decode((string) file_get_contents($data_file), true) ?: [];
    
    // Initialize path data if missing
    if (!isset($data[$path])) {
        $data[$path] = ['total' => 0];
    }
    
    // Count only if IP hash is new for this 60min window
    if (!isset($data[$path][$time_key])) {
        $data[$path][$time_key] = [];
    }
    
    if (!in_array($ip_hash, $data[$path][$time_key])) {
        $data[$path]['total']++;
        $data[$path][$time_key][] = $ip_hash;
    }
    
    // Cleanup old time windows (keep last 10 days)
    foreach ($data[$path] as $key => $val) {
        if ($key !== 'total' && $key < ($current_time - 864000)) {
            unset($data[$path][$key]);
        }
    }
    
    // Save data
    file_put_contents($data_file, json_encode($data));
    flock($lock, LOCK_UN);
    fclose($lock);
    
    // Return count
    header('Content-Type: application/json');
    echo json_encode(['count' => $data[$path]['total']]);
    ?>
    

    If counting hits site-wide, simply change this:

    $path = $_GET['path'] ?? '';
    // Validate path (only allow /post/slug/ format)
    if (!preg_match('~^/post/[a-z0-9\-]+/$~i', $path)) {
        header('Content-Type: application/json');
        echo json_encode(['error' => 'invalid_path']);
        exit;
    }
    

    to this:

    $path = $_GET['path'] ?? '';
    $path = filter_var($path, FILTER_SANITIZE_URL);
    if (empty($path)) {
        header('Content-Type: application/json');
        echo json_encode(['error' => 'empty_path']);
        exit;
    }
    
  4. Permissions (important!)

    Check and double-check that hits.php and visits.json are set to 640 (rw-r-----), and that the /data directory is set to 750 (rwxr-x---); example commands follow this list.

    There is no reason whatsoever that Joe Public should have access to any of those files or the data directory.
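
Setting those permissions from a shell is a one-off job; a minimal sketch, assuming the commands are run from the website root and that PHP runs as the owner of the files (adjust user and group to suit your server):

    chmod 640 hits.php
    chmod 750 data
    chmod 640 data/visits.json

Once everything is uploaded, a quick request from the command line is an easy sanity check. The domain here is a placeholder and the count is illustrative; an invalid path comes back as {"error":"invalid_path"} instead:

    curl "https://example.com/hits.php?path=/post/some-post-name/"
    {"count":1}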

Again, why?

I have not reinvented the wheel! What I’m trying to do here is to have an old-school hit counter, just because. I actively use ad blockers on everything, and since this counter is a first-party tracker (not a third-party one), it is more likely to accurately record genuine page visits and not be blocked. Oh, bots will still be recorded as visitors; I’m not interested in blocking crawlers and AI scrapers as I don’t like playing whack-a-mole, and my robots.txt is mostly blank in that regard. But the script does limit the aggressive ones with the 60-minute rule.

Secondly, I do not want to use third-party tools, which absolutely do not respect anyone’s privacy despite their claims. As noted above, this script hashes IP addresses and the hashes are stored for a maximum of 10 days. Strictly speaking the hashes can’t be reversed, and even if I went to the trouble of brute-forcing them with the salt, why bother, when I can just open my AWStats to find the un-hashed IP addresses! The IP hashing is there in case I spring a leak and someone manages to scrape the data.

All my requirements have been satisfied. I’m satisfied, finally.

PS: I still see you, but hashed! 😇️

Tagged in: #PHP
