
About Cloud Platform logging

Cloud Platform uses multiple layers of infrastructure to support your Drupal application. Each layer generates its own log entries and stores them in its own log files.

You can access your application’s logs on Cloud Platform using several methods, such as streaming, downloading, and log forwarding, as described in the following sections.

Available log files

Cloud Platform makes the following logs available for your use through various methods, such as streaming, download, or log forwarding:

  • Apache access logs (access.log; available for download and streaming): Contains a list of requests for your application that have bypassed Varnish®. These requests include pages, theme files, and static media files.
  • Apache error log (error.log; available for download and streaming): Records any Apache-level issues. The issues reported here are typically caused by general infrastructure issues, including capacity problems, .htaccess problems, and missing files.
  • Cloud Hooks logs (cloud-hook.log; Cloud Next only; available for streaming): Records all messages sent to the standard output stream (STDOUT) during Cloud Hook execution.
  • Drupal request log (drupal-requests.log; available for download and streaming): Records all Drupal page loads on your application.
  • Drupal watchdog log (drupal-watchdog.log; available for download and streaming): Records Drupal-related actions on your application. The watchdog log is recorded on your infrastructure if you have enabled the Syslog module.
  • FPM access logs (fpm-access.log; available for streaming): Records all requests handled by FPM’s process management in PHP.
  • FPM error logs (fpm-error.log; available for streaming): Records infrastructure-level issues with FPM’s process management in PHP. For application-level PHP issues, see the PHP error log.
  • PHP error logs (php-errors.log; available for download and streaming): Records any issues that occur during the PHP processing portion of a page load, including issues caused by an application’s code, configuration, or content.
  • MySQL slow query log (available for download only): Contains a list of MySQL queries that have taken longer than one second to complete. Since slow query logs are stored in a root-only MySQL directory on your infrastructure, you can only download them using the Logs page, and you can’t access them directly on the infrastructure. For more information, see Downloading your slow query log and Tools for parsing a slow query log.
  • Scheduled Cron Jobs (cronjob.log; Cloud Next only; available for streaming): Records all messages sent to the standard output stream (STDOUT) during cron job execution.
  • Varnish request logs (available through log forwarding only): Records all requests processed by Varnish, both cached and uncached. Available only to subscriptions with dedicated load balancers that forward logs to an external service.
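
For the logs marked as available for streaming, one convenient option is Acquia CLI’s app:log:tail command, assuming you have Acquia CLI installed and authenticated. The command prompts you to select an application and environment before it begins streaming:

acli app:log:tail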

Logging in Site Factory

In addition to the log files mentioned on this page, Site Factory offers additional audit and task logging. For more information, see Monitoring Site Factory.

Downloading active log files from the Logs page

You can download the most recent logs for each infrastructure in an environment from the Logs page by completing the following steps:

Note

For Cloud Next technologies, Acquia stores logs for 30 days. You can download up to 24 hours of log data at a time.

  1. Sign in to the Cloud Platform user interface as a user with the download logs permission for the subscription and environment you want.

  2. Select the application for which you want to download a log.

  3. Select the environment for which you want to download a log.

  4. Select Logs > Download.

  5. Click the Download link for the log you want to download. If the log file is available for immediate download, it begins downloading immediately. If the log file must be created on demand, a dialog displays the status of the request.

  6. Once the log file is ready for download, click Download to retrieve it.

The log file will be downloaded as logfilename-timestamp.tar.gz.

For environments running on Cloud Next technologies, this page provides additional functionality that is not available for Cloud Classic environments. Before downloading a specific log, select the date and starting time, and then click Download. Cloud Platform automatically generates a file containing up to 24 hours’ worth of log data, based on the date and start time you selected.

On Cloud Classic environments, you can only download logs for the current day using this interface.
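
If you prefer to script log retrieval rather than use the Logs page, Cloud API v2 exposes the same data. The following is a sketch that assumes you have an OAuth access token in the ACQUIA_CLOUD_TOKEN environment variable and substitutes a placeholder environment ID; log type names mirror the logs listed earlier on this page (for example, apache-access):

# List the log files available for an environment:
curl -H "Authorization: Bearer $ACQUIA_CLOUD_TOKEN" \
    "https://cloud.acquia.com/api/environments/[environment ID]/logs"

# Request a download URL for a specific log type:
curl -H "Authorization: Bearer $ACQUIA_CLOUD_TOKEN" \
    "https://cloud.acquia.com/api/environments/[environment ID]/logs/apache-access"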

Downloading historical logs directly from the infrastructure

For environments running on Acquia’s Cloud Classic infrastructure, you can access logs using SSH, rsync, SCP, or other tools. This functionality is not available for environments running on Cloud Next; use Acquia CLI, the Cloud Platform user interface, or Cloud API v2 instead. For historical logs on Cloud Next, see Downloading active log files from the Logs page.

The Download Logs page allows you to select a specific date and time range when downloading a log file. Logs are limited to 24-hour increments. Downloading logs can take up to 30 seconds longer on Cloud Next than on Cloud Classic because of a remote log retention service that ensures logs persist when environments auto-scale.

Log files on Cloud Classic environments don’t persist after an infrastructure is relaunched or resized.

Historical logs are stored in a location on the infrastructure that’s optimized for fast read/write activity. While this works well for actively and simultaneously updating several log files, the directory won’t persist after the infrastructure is relaunched or resized. Log files do persist after the infrastructure is rebooted. A relaunch can happen at any time, for example, in the event of an infrastructure failure.

For environments running on Cloud Next technologies, the directory /shared/logs can be used for custom log files, as well as for cron logs. These logs are not automatically rotated, however, so you must periodically prune this directory to keep it from consuming too much of the storage allocated to your subscription.
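
Cloud Platform doesn’t rotate these files for you, so a scheduled cleanup task is a common approach. The following is a minimal sketch, assuming a 30-day retention window and custom log files that use a .log suffix; adjust both to fit your subscription’s storage needs:

# Delete custom log files in /shared/logs older than 30 days:
find /shared/logs -name '*.log' -mtime +30 -delete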

Note

Acquia retains subscriber infrastructure log data remotely for at least 12 months for internal compliance and security audit purposes. Because log files don’t persist after infrastructure relaunches (which happen periodically during routine maintenance events), subscribers who require ongoing and persistent access to historical infrastructure logs should use Acquia’s Log forwarding service.

Location of log files

For environments running on Acquia’s Cloud Classic infrastructure, each server maintains its own log files. For each server, the log files are located at /var/log/sites/[site].[env]/logs/[servername]/[logname].log. You can find your website name and server names on the Infrastructure page. For example, the Apache error log for the Dev environment of a website named myexample, on a server named srv-25, would be:

/var/log/sites/myexample.dev/logs/srv-25/error.log

For environments running on Cloud Next technologies, a limited subset of logs generated by custom activities, such as cron jobs leveraging Acquia’s cron-wrapper.sh script, can be found in the /shared/logs directory. This directory is persistent and maps to your file system, and it can be used for all custom log files.
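
For example, a Cloud Hook or custom cron task could append its own timestamped entries to a file in this directory. The deploy.log file name below is only illustrative:

echo "deployment finished at $(date -u +%Y-%m-%dT%H:%M:%SZ)" >> /shared/logs/deploy.log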

Archived logs

On Cloud Classic environments, Cloud Platform creates a new log file every day and compresses, archives, and saves old logs by date in the following format (YYYYMMDD):

  • access.log-20140527.gz
  • error.log-20140527.gz
  • drupal-requests.log-20140527.gz
  • drupal-watchdog.log-20140527.gz
  • php-errors.log-20140527.gz

Cloud Platform archives log files by file name, and it won’t archive files in the logs directory if their default names (like those listed above) have been changed.
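
Because archived logs are compressed with gzip, you can inspect them in place using standard gzip-aware tools instead of downloading and extracting them first. For example, reusing the example path from the previous section (the search pattern is only illustrative):

zgrep 'POST /user/login' /var/log/sites/myexample.dev/logs/srv-25/access.log-20140527.gz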

Accessing logs using SSH

Note

This section does not apply to environments running on Cloud Next technologies.

You can access log files on an infrastructure using SSH. For example, to view the Apache access log on the specified infrastructure:

less /var/log/sites/[site].[env]/logs/[servername]/access.log
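
The same path works with other standard command-line tools. For example, to follow new entries in the Apache error log as they are written, or to search the access log for requests that returned a 500 response:

tail -f /var/log/sites/[site].[env]/logs/[servername]/error.log
grep ' 500 ' /var/log/sites/[site].[env]/logs/[servername]/access.log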

Downloading web infrastructure logs using the rsync command

You can use the rsync command from your local machine to download logs:

  1. To list the log files on your web infrastructure, use the rsync command and substitute the correct values for your application, infrastructure, and Cloud Platform SSH key pathname (typically ~/.ssh/id_rsa):

    rsync -avz -e 'ssh -i /path/to/private/key/file' [site name]@[host name]:[log path]/
    

    The rsync command produces a list like the following example:

    receiving file list ... done
    drwxr-xr-x 4096 2010/01/26 19:34:05 .
    -rw-r--r-- 83581323 2010/01/27 12:05:53 access.log
    -rw-r--r-- 214919 2010/01/27 12:04:57 error.log
    -rw-r----- 995 2010/01/27 04:15:29 php-errors.log
    
  2. For each file you want to download, use the rsync command and substitute the correct values for your application, infrastructure, Cloud Platform SSH key, and the file you want to download:

    rsync -avz -e 'ssh -i /path/to/private/key/file' [site name]@[host name]:[log path]/[file name].log [local path]
    

    For example:

    rsync -avz -e 'ssh -i /Users/esymbolist/.ssh/id_rsa' example@[host name]:/mnt/gfs/home/example/logs/access.log-20151225.gz .
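
    To download all of the archived logs for a given date in a single transfer, you can quote a wildcard pattern so that it’s expanded on the infrastructure rather than on your local machine. This sketch reuses the placeholders from the steps above:

    rsync -avz -e 'ssh -i /path/to/private/key/file' '[site name]@[host name]:[log path]/*.log-20151225.gz' ./logs/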
    

Downloading an individual infrastructure log using the SCP command

You can use the scp command from your local machine to download individual logs. To download an individual infrastructure log, use the scp command and substitute the values for your application, infrastructure, Cloud Platform SSH key pathname (typically ~/.ssh/id_rsa), and the file you want to download:

scp -i /path/to/private/key/file [site name]@[host name]:[log path]/[file name].log [local path]

For example:

scp -i /Users/esymbolist/.ssh/id_rsa example@[host name]:/mnt/gfs/home/example/logs/access.log-20151225.gz ~/Documents/logs

Additional (inaccessible) logs

Acquia uses additional logs, beyond those mentioned earlier on this page, that aren’t available to subscribers. These logs are inaccessible for several reasons, including the potential for security issues related to shared resources.

  • Binary logs: Binary log (binlog) files are used to replicate data from the master database to a replica database. All statements that change data, including create, update, and delete statements, add lines to the MySQL binlogs. Queries that only read from the database, without changing the contents, don’t add lines to the binlogs. Staging environments, despite not having redundant databases, also use binlogs to keep feature and performance parity between non-production and production environments. For more information, see Binlogs.
  • Email logs: Email logs are often a shared infrastructure resource, may have sensitive information, and aren’t available to subscribers.