Cloud Platform

About Cloud Platform logging

Cloud Platform uses multiple layers of infrastructure to support your Drupal application. Each layer generates its own log entries and stores them in its own log files.

You can access your application’s logs in Cloud Platform using several methods, described in the sections that follow.

Available log files

Logs in Cloud Classic and Cloud Next

Cloud Platform makes the following logs available for your use through various methods, such as streaming, download, or log forwarding:

  • Apache access logs (access.log): Contains a list of requests for your application that have bypassed Varnish®. These requests include pages, theme files, and static media files.
  • Apache error logs (error.log): Records any Apache-level issues. The issues reported here are typically caused by general infrastructure issues, including capacity problems, .htaccess problems, and missing files.
  • Balancer logs (not available for download): Provides information about requests received at the balancer level.
  • Drupal request logs (drupal-requests.log): Records all Drupal page loads on your application.
  • Drupal watchdog logs (drupal-watchdog.log): Records Drupal-related actions on your application. The watchdog log is recorded on your infrastructure only if you have enabled the syslog module, as shown in the sketch after this list.
  • FPM access logs (fpm-access.log; download available in Cloud Next only): Records all requests handled by FPM’s process management in PHP.
  • FPM error logs (fpm-error.log; download available in Cloud Next only): Records infrastructure-level issues with FPM’s process management in PHP. For application-level PHP issues, see the PHP error log.
  • MySQL slow query logs (download available in Cloud Classic only): Contains a list of MySQL queries that have taken longer than 1 second to complete. Since slow query logs are stored in a root-only MySQL directory on your infrastructure, you can download them only through the Cloud Platform user interface and cannot access them directly on the infrastructure. For more information, see Downloading your slow query log and Tools for parsing a slow query log.
  • PHP error logs (php-errors.log): Records any issues that occur during the PHP processing portion of a page load, including issues caused by an application’s code, configuration, or content.
  • Varnish request logs (not available for download): Records all requests processed by Varnish, both cached and uncached. Available only to subscriptions with dedicated load balancers that forward logs to an external service.
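
The Drupal watchdog entry above depends on Drupal’s core syslog module. A minimal sketch of enabling it with Drush (command names assume Drush 9 or later):

# Enable the core syslog module so that watchdog messages are written
# to drupal-watchdog.log on the infrastructure, then rebuild caches.
drush pm:enable syslog -y
drush cache:rebuild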

Logging in Site Factory

In addition to the log files mentioned on this page, Site Factory offers audit and task logging. For more information, see Monitoring Site Factory.

Logs only in Cloud Next

Cloud Next environments make the following logs available for your use through various methods, such as downloading or log forwarding:

  • Cloud Hooks logs (cloud-hook.log): Records all messages sent to the standard output stream (STDOUT) during Cloud Platform hook execution, as in the sketch after this list.
  • Scheduled cron job logs (cronjob.log): Records all messages sent to the standard output stream (STDOUT) during cron job execution.
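
For example, anything a Cloud Platform hook script prints ends up in cloud-hook.log. A minimal sketch (the hook path follows the standard Cloud Hooks layout; the script name is hypothetical):

#!/bin/sh
# hooks/common/post-code-deploy/log-deploy.sh (hypothetical file name)
# Cloud Hooks pass the site name and target environment as the first
# two arguments; anything echoed here is captured in cloud-hook.log.
site="$1"
target_env="$2"
echo "Code deployment finished for $site.$target_env"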

Logs only in Cloud Classic

Cloud Classic environments make the following logs available for your use through various methods, such as streaming, download, or log forwarding:

  • Shell logs (bash.log; not available for streaming): Records all commands executed by the application user, including SSH, cron, and user commands executed in the context of the site.

Downloading active log files

To download active log files:

  1. Sign in to the Cloud Platform user interface as a user with the download logs permission for the subscription and environment you want.
  2. Select the application and environment for which you want to download a log.
  3. Select Logs > Download.

  4. Click the Download link for the log that you want to download.

    • If the log file is available for immediate download, the system begins downloading the file.
    • If the log file must be created on demand, the system displays a dialog box with the status of the request.

  5. After the log file is ready for download, click Download to retrieve it.

    The system downloads the log file in the following format:

    logfilename-timestamp.tar.gz

Note

  • In Cloud Next: You can select any time range of up to 24 hours. Acquia stores logs for 30 days.
  • In Cloud Classic: The Logs page lets you download logs for the current day.
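
If you prefer the command line, Acquia CLI can also stream active logs. A sketch (assuming your acli version provides the app:log:tail command; run acli list to confirm):

# Stream active log entries for an environment; acli prompts you to
# choose the application and environment if they aren't specified.
acli app:log:tail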

Downloading historical logs

You can access or download historical logs by using the following:

  • In Cloud Classic: SSH, rsync, SCP, the Cloud Platform user interface, or other tools.
  • In Cloud Next: Acquia CLI, the Cloud Platform user interface, or Cloud API v2 (see the sketch after this list). You can use SSH, rsync, and SCP to download custom log files written to a persistent directory such as /shared/logs. These logs are not automatically rotated, so you must periodically prune this directory to ensure that it does not consume the bulk of the storage allocated to your subscription.
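
A minimal sketch of retrieving logs through Cloud API v2 with curl (the endpoint paths and the apache-access log type name are assumptions; verify them against the Cloud API v2 specification):

# 1. Exchange your API client credentials for an OAuth token.
TOKEN=$(curl -s https://accounts.acquia.com/api/auth/oauth/token \
  -d "client_id=${ACQUIA_KEY}" \
  -d "client_secret=${ACQUIA_SECRET}" \
  -d "grant_type=client_credentials" | jq -r .access_token)

# 2. List the log types available for an environment UUID.
curl -s -H "Authorization: Bearer ${TOKEN}" \
  "https://cloud.acquia.com/api/environments/${ENV_UUID}/logs"

# 3. Request a download for a specific log type.
curl -s -H "Authorization: Bearer ${TOKEN}" \
  "https://cloud.acquia.com/api/environments/${ENV_UUID}/logs/apache-access"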

The Logs page allows you to select a specific date and time range when downloading a log file. Logs are limited to 24-hour increments. Downloading logs might take up to 30 seconds longer in Cloud Next than in Cloud Classic because of the remote log retention service, which ensures that logs persist when environments auto-scale.

Log files on Cloud Classic environments don’t persist after an infrastructure is relaunched or resized.

Historical logs are stored in a location on the infrastructure that’s optimized for fast read/write activity. While this suits several log files being updated simultaneously, the directory won’t persist after the infrastructure is relaunched or resized. Log files do persist after the infrastructure is rebooted. A relaunch can happen at any time, for example, in the event of infrastructure failure.

Note

Acquia retains subscriber infrastructure log data remotely for at least 12 months for internal compliance and security audit purposes. Since log files do not persist after infrastructure relaunches (which happen periodically during routine maintenance events), subscribers who require ongoing and persistent access to historical infrastructure logs should use Acquia’s Log forwarding service.

Location of log files

  • Cloud Classic: Each server maintains its own log files, located at /var/log/sites/[site].[env]/logs/[servername]/[logname].log. You can find your website name and server names on the Infrastructure page. For example, the Apache error log for the Dev environment of a website named myexample on a server named srv-25 is:

    /var/log/sites/myexample.dev/logs/srv-25/error.log

    Apart from Cloud Hooks logs and Scheduled Cron Jobs logs, all the logs listed in the Available log files section are available for download through the Cloud Platform user interface.

  • Cloud Next: A limited subset of logs generated by custom activities, such as cron jobs leveraging Acquia’s cron-wrapper.sh script, can be found in the /shared/logs directory. This directory is persistent, maps to your file system, and can be used for all custom log files. Unlike Cloud Classic, the logs listed in the Available log files section are available for download through the Cloud Platform user interface but cannot be accessed through SSH. A sketch of writing a custom log to /shared/logs follows this list.
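
A minimal sketch of a Cloud Next scheduled job that appends its output to a custom log under /shared/logs (the docroot path uses this page’s placeholder conventions; custom-cron.log is a hypothetical file name):

# Append drush cron output to a persistent custom log. Prune
# /shared/logs periodically, because these files aren't rotated.
drush --root=/var/www/html/[site].[env]/docroot cron >> /shared/logs/custom-cron.log 2>&1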

Archived logs

On Cloud Classic environments, Cloud Platform creates a new log file every day and compresses, archives, and saves old logs by date in the following format (YYYYMMDD):

  • access.log-20140527.gz
  • error.log-20140527.gz
  • drupal-requests.log-20140527.gz
  • drupal-watchdog.log-20140527.gz
  • php-errors.log-20140527.gz

Cloud Platform archives log files by file name, and won’t archive files in the logs directory whose default names, like those listed above, have been changed.
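
On Cloud Classic, you can also inspect these archives over SSH without downloading them. A sketch using the example paths from this page:

# List a server's archived logs, then page through one in place.
ls /var/log/sites/myexample.dev/logs/srv-25/*.gz
zless /var/log/sites/myexample.dev/logs/srv-25/access.log-20140527.gz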

Accessing logs using SSH

Note

In Cloud Next, the logs listed in the Available log files section cannot be accessed through SSH, scp, or rsync. However, if you write your custom or cron logs to one of the following directories, you can access them through SSH, scp, and rsync:

  • /shared/logs
  • $HOME
  • any persistent directory

For Cloud Classic, you can access log files on an infrastructure using SSH. For example, to view the Apache access log on the specified infrastructure:

less /var/log/sites/[site].[env]/logs/[servername]/access.log
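
To follow a log as new entries are written, substitute tail for less. A sketch:

# Stream new Apache access log entries as they arrive (Ctrl+C to stop).
tail -f /var/log/sites/[site].[env]/logs/[servername]/access.log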

Downloading web infrastructure logs using the rsync command

You can use the rsync command from your local machine to download logs:

  1. To list the log files on your web infrastructure, use the rsync command and substitute the correct values for your application, infrastructure, and Cloud Platform SSH key pathname (typically ~/.ssh/id_rsa):

    rsync -avz -e 'ssh -i /path/private/keyfile' [username]@[host name]:[logpath]
    

    The rsync command produces a list like the following example:

    receiving file list ... done
    drwxr-xr-x 4096 2010/01/26 19:34:05 .
    -rw-r--r-- 83581323 2010/01/27 12:05:53 access.log
    -rw-r--r-- 214919 2010/01/27 12:04:57 error.log
    -rw-r----- 995 2010/01/27 04:15:29 php-errors.log
    
  2. For each file you want to download, use the rsync command and substitute the correct values for your application, infrastructure, Cloud Platform SSH key, and the file you want to download:

    rsync -avz -e 'ssh -i /path/to/private/key/file' [site name]@[hostname]:[log path]/[file name].log [local path]
    

    For example:

    rsync -avz -e 'ssh -i /Users/esymbolist/.ssh/id_rsa' example@[host name]:/mnt/gfs/home/example/logs/access.log-20151225.gz .
    

Downloading an individual infrastructure log using the SCP command

You can use the scp command from your local machine to download individual logs. To download an individual infrastructure log, use the scp command and substitute the values for your application, infrastructure, Cloud Platform SSH key pathname (typically ~/.ssh/id_rsa), and the file you want to download:

scp -i /path/to/private/key/file [site name]@[host name]:[log path]/[filename].log [local path]

For example:

scp -i /Users/esymbolist/.ssh/id_rsa example@[host name]:/mnt/gfs/home/example/logs/access.log-20151225.gz ~/Documents/logs

Additional (inaccessible) logs

Acquia uses additional logs, beyond those mentioned earlier on this page, that aren’t available to subscribers. These logs are inaccessible for several reasons, including the potential for security issues related to shared resources.

  • Binary logs: Binary log (binlog) files are used to replicate data from the master database to a replica database. All statements that change data, including create, update, and delete statements, add lines to the MySQL binlogs. Queries that only read from the database, without changing the contents, don’t add lines to the binlogs. Staging environments, despite not having redundant databases, also use binlogs to keep feature and performance parity between non-production and production environments. For more information, see Binlogs.
  • Email logs: Email logs are often a shared infrastructure resource, may have sensitive information, and aren’t available to subscribers.