Using .htaccess to restrict access

The cron job parameter --url wants a URL, correct?

Manually running the cron job outputs an error containing the exact HTML of a 403 Forbidden page (see below). When I comment out the Require ip ... lines in the .htaccess file, the cron job works fine.

Error: Got invalid response from API request: ?module=API&method=API.get&idSite=2&period=day&date=last2&format=php&trigger=archivephp. Response was '<!DOCTYPE html> <html lang="en"> <head>   <meta charset="utf-8">   <meta http-equiv="x-ua-compatible" content="ie=edge">   <meta name="viewport" content="width=device-width, initial-scale=1, shrink-to-fit=no">   <title>403 Forbidden</title>   <link rel="stylesheet" href="/error_docs/styles.css"> </head> <body> <div class="page">   <div class="main">     <h1>Server Error</h1>     <div class="error-code">403</div>     <h2>Forbidden</h2>     <p class="lead">You do not have permission to access this document.</p>     <hr/>     <p>That's what you can do</p>     <div class="help-actions">       <a href="javascript:location.reload();">Reload Page</a>       <a href="javascript:history.back();">Back to Previous Page</a>       <a href="/">Home Page</a>     </div>   </div> </div> </body> </html>'

Hi,

The cron job does access the HTTP API, so I’d recommend looking into the web server log to see how exactly the rule is blocking the request (I know nothing about Apache, so I can’t be more specific).

Thanks Lukas, I’ve checked the Apache error_log for the domain and found that the HTTP API uses IP address 0.0.0.0, so I’ve changed my .htaccess to something like:

<Files "*">
        # Allow localhost IPv4 IPv6
        Require ip 127.0.0.1 ::1
        # Allow HTTP API
        Require ip 0.0.0.0
        # Allow admin IP
        Require ip <my_IP>
</Files>

# Allow public access to basic Matomo files
<FilesMatch "(^piwik\.(php|js)|^matomo\.(php|js)|^container_.*\.js|robots\.txt|optOut.js)">
        Require all granted
</FilesMatch>

Now it works without issues.

PS: I probably don’t need to allow localhost, but it doesn’t hurt.
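
Side note: Apache 2.4 also has Require local, which matches 127.0.0.0/8, ::1 and connections where the client and server address are the same, so the two loopback entries could probably be replaced with something like this (untested):

<Files "*">
        # Covers IPv4/IPv6 loopback and same-host connections
        Require local
        # Allow HTTP API
        Require ip 0.0.0.0
        # Allow admin IP
        Require ip <my_IP>
</Files>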

Hi @mezzomedia,

Please check that requests from other hosts are still blocked. At least on some systems 0.0.0.0 matches all IP addresses, so this might allow everyone to access the URLs.
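
If the archiving request really comes from the server itself, allowing the server’s own IP address instead of 0.0.0.0 might be safer, e.g. (untested; <server_IP> is a placeholder for the server’s public IP):

        # Allow HTTP API from the server itself
        Require ip <server_IP>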

I think I’m a bit lost with this right now. Can someone please post the updated full .htaccess configuration for password-restricted access that fulfills all requirements?

Allow external access to the files:

  • matomo.php
  • matomo.js
  • piwik.php
  • piwik.js
  • plugins/CoreAdminHome/javascripts/optOut.js
  • favicon.ico (and misc/user/favicon.png ?)
  • js/container_*.js
  • plugins/HeatmapSessionRecording/configs.php

Allow external access to the URL:

  • index.php?module=CoreAdminHome&action=optOut

Allow the cron job (/path/to/matomo/console core:archive --url=http://example.org/) to be run.
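
A rough attempt at covering all of this might look like the following (untested; it assumes Apache 2.4 with mod_authz_core, and <server_IP> / <my_IP> are placeholders for the server’s and the admin’s IP):

# Deny everything by default, allow only known IPs (localhost and <server_IP> cover the cron job)
<Files "*">
        Require ip 127.0.0.1 ::1 <server_IP> <my_IP>
</Files>

# Re-open the public files listed above (FilesMatch matches by file name only, so this is slightly broader than the exact paths)
<FilesMatch "^(matomo\.(php|js)|piwik\.(php|js)|container_.*\.js|optOut\.js|configs\.php|favicon\.(ico|png)|robots\.txt)$">
        Require all granted
</FilesMatch>

# index.php: opt-out URL for everyone, everything else only for the IPs above
<Files "index.php">
        <RequireAny>
                Require ip 127.0.0.1 ::1 <server_IP> <my_IP>
                Require expr "%{QUERY_STRING} =~ /^module=CoreAdminHome&action=optOut/"
        </RequireAny>
</Files>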

But maybe this is already too complicated. Why restrict access to all files except the listed ones, rather than restricting only index.php, with the opt-out URL as the sole exception?

Something like this (untested):

<Files index.php>
        Order deny,allow
        Deny from all
        AuthType Basic
        AuthName "Matomo"
        AuthUserFile /path/to/.htpasswd
        Require valid-user
        Satisfy Any
        <If "(%{QUERY_STRING} =~ /^module\=CoreAdminHome\&action\=optOut/)">
                Require all granted
        </If>
</Files>

Because the idea is to add extra security to the sign-in page only, or am I missing something here?
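
Order, Deny and Satisfy come from the old mod_access_compat module, by the way. On a plain Apache 2.4 setup the same idea could probably be written with only the newer authorization directives, something like this (again untested; /path/to/.htpasswd is a placeholder):

<Files "index.php">
        AuthType Basic
        AuthName "Matomo"
        AuthUserFile /path/to/.htpasswd
        # Either a valid user or the opt-out query string is enough
        <RequireAny>
                Require valid-user
                Require expr "%{QUERY_STRING} =~ /^module=CoreAdminHome&action=optOut/"
        </RequireAny>
</Files>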

For anyone who has issues running the command with --url=https://example.com when a password is set up in .htaccess: check https://matomo.org/faq/mobile-app/faq_16336/ and put the credentials in the URL, like --url=https://user:password@example.com, and the command will run without any errors.
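
For example, a crontab entry might then look like this (the schedule, path and credentials are placeholders):

5 * * * * /path/to/matomo/console core:archive --url=https://user:password@example.com/ > /dev/null 2>&1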