Content Hub utilizes queues to export content from, and import content into, Drupal sites. To process the queues in the background, Acquia recommends using the scheduled jobs feature of Cloud Platform.
In Cloud Platform Enterprise and Site Factory, you can configure scheduled jobs for each environment through the Cloud Platform user interface. For more information, see Using scheduled jobs to support your application.
Important
- When using multisite on Cloud Platform Enterprise, ensure that you include the --uri or -l option in the Drush command. This ensures that the command targets the correct site and operates within the desired site environment.
- Applications running on Cloud Next have a time limit of 1 hour for scheduled jobs. Content Hub publishers with large dependency trees may exceed that time limit during exports, which leaves the dependency cache in an incomplete and unstable state. Therefore, Acquia recommends using the --time-limit option of the Drush command as a control measure.
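For example, a scheduled job on a multisite Cloud Next publisher might combine both options. The following is an illustrative sketch rather than a required command: the site URI, the queue name acquia_contenthub_publish_export, and the 50-minute limit are assumptions to adapt to your application, and the generic drush queue:run command is used here because it accepts --time-limit:
drush --uri=https://mysite.com -r /var/www/html/${AH_SITE_NAME}/docroot queue:run acquia_contenthub_publish_export --time-limit=3000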
Resource requirements
The following list describes how the queues used by Content Hub differ in their resource requirements:
- Export queues can be CPU intensive as they calculate dependencies for each entity during the export process.
- Import queues can be database intensive as they save multiple entities in the database.
The following are the additional resource requirements for Content Hub:
- The content structure and dependency tree in a site affect the resources needed to process the queues. The queues must consider all the dependencies that are implicated for export or import. If the content structure is highly complex or the number of dependencies is high, the system requires more resources to calculate dependencies during export and to create or update entities during import.
- A dedicated cron job server may be necessary for larger or complex deployments of Content Hub. For more information, contact your account manager.
Frequency of scheduled jobs
The frequency of scheduled jobs can be configured to meet specific business requirements. Processing the queues at higher frequencies minimizes the delay between exporting entities from a publisher and importing entities into a subscriber.
Note
Scheduled jobs on Cloud Next can only run once every 5 minutes. If you require a higher frequency, like every minute, you can use a wrapper script. Example wrapper scripts are included with Content Hub 3.3.0 and later.
To prevent multiple scheduled jobs from interfering with each other, Acquia recommends using Flock. Flock is a file-based process locking utility available on Cloud Platform.
Acquia recommends reviewing the wrapper script examples in the acquia_contenthub module for version 3.3.0 or later, and adapting them as needed:
- scripts/queue_cron_example.sh: This Bash script executes a single instance of the drush queue command while using Flock to prevent concurrency issues.
- scripts/queue_cron_one_min.sh: This Bash script implements a while loop to run the drush queue command every minute for a duration of 5 minutes. It also uses Flock to prevent concurrency. This script is designed for use on Cloud Next, where per-minute scheduled jobs are not permitted.
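The following is a minimal sketch of such a wrapper, not a copy of the module's script. The lock file name, site URI, and docroot path are the same placeholders used elsewhere on this page and must be adapted to your environment:

```bash
#!/bin/bash
# Minimal per-minute queue runner sketch for Cloud Next, assuming the lock
# file, site URI, and docroot placeholders used in the examples on this page.
LOCK_FILE="/mnt/gfs/${AH_SITE_NAME}/files-private/achim0.lck"
DOCROOT="/var/www/html/${AH_SITE_NAME}/docroot"
SITE_URI="https://mysite.com"

# Run the import queue once per minute for the 5-minute scheduled job window.
# flock -xn skips the run if a previous instance still holds the lock, so
# concurrent processors do not interfere with each other.
for i in 1 2 3 4 5; do
  flock -xn "${LOCK_FILE}" -c "drush -l ${SITE_URI} -r ${DOCROOT} ach-import"
  sleep 60
done
```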
Example values
The following table provides example values for the parameters used in scheduled jobs:
| Example Value | Description | Notes |
|---|---|---|
| achex0.lck, achim0.lck | Include a numerical index in your Flock files for both export queues (achex) and import queues (achim) to enable the use of multiple processors. | In the example values, replace 0 with the number applicable to your application. |
| https://<mysite>.com | To ensure that absolute URLs are generated accurately, you must explicitly define the URI of the Drupal site in each scheduled job when using the Drush command. | In the example value, replace <mysite> with the URI of your Drupal site. |
Configuring an export queue processor
The export queue consists of the content from a publisher ready for syndication through the Content Hub service. To set up a scheduled job for processing the export queue on a publisher, run the following command:
flock -xn /mnt/gfs/${AH_SITE_NAME}/files-private/achex0.lck -c "drush -l https://mysite.com -r /var/www/html/${AH_SITE_NAME}/docroot ach-export"
Configuring an import queue processor
The import queue consists of content from the Content Hub service ready to be created or updated on a subscriber. To set up a scheduled job for processing the import queue on a subscriber, run the following command:
flock -xn /mnt/gfs/${AH_SITE_NAME}/files-private/achim0.lck -c "drush -l https://mysite.com -r /var/www/html/${AH_SITE_NAME}/docroot ach-import"
Verifying queue processing
Logs related to import or export queues can be found in the Drupal watchdog logs or in the Logs section of the Publisher's Dashboard. Acquia recommends using the ach-export and ach-import Drush commands provided for cron jobs. These commands ensure that any Drush-specific messages are logged in the Drupal watchdog logs. When cron jobs must output the logs to files for debugging, you can append the following to the command:
For the import queue:
&>> /var/log/sites/${AH_SITE_NAME}/logs/$(hostname -s)/achim0-`date "+\%F"`.log
For the export queue:
&>> /var/log/sites/${AH_SITE_NAME}/logs/$(hostname -s)/achex0-`date "+\%F"`.log
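For example, appending the import queue redirection to the scheduled job shown earlier yields a single command of the following form; the URI and paths are the same placeholders used throughout this page:
flock -xn /mnt/gfs/${AH_SITE_NAME}/files-private/achim0.lck -c "drush -l https://mysite.com -r /var/www/html/${AH_SITE_NAME}/docroot ach-import" &>> /var/log/sites/${AH_SITE_NAME}/logs/$(hostname -s)/achim0-`date "+\%F"`.log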
Troubleshooting slow queue processing
If your queues are processing slowly, consider the following options to speed up the processing of queue items:
- Create more queue processors or queue runners per site by adding more scheduled jobs pointing to the same queue with different lock files. The included wrapper scripts offer an optional -i flag to specify an index, which can be used to define additional queue processors. For an example, see the sketch after this list.
- Distribute queue processors or queue runners across multiple servers. The number of scheduled jobs that can be added to each server depends on the size of the server, the availability of PHP procs, and the complexity of the site content structure. Analyze all these parameters to determine whether configuring multiple processors per queue is appropriate.
- Consider employing a dedicated cron server where all queue processing occurs. A dedicated cron job server may be necessary for larger or complex Content Hub deployments. For more information, contact your account manager.
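As a sketch of the first option, the following pair of scheduled jobs points at the same export queue but uses separate lock file indexes (achex0 and achex1), so the two processors can run concurrently without blocking each other; the URI and paths are the placeholders used throughout this page:
flock -xn /mnt/gfs/${AH_SITE_NAME}/files-private/achex0.lck -c "drush -l https://mysite.com -r /var/www/html/${AH_SITE_NAME}/docroot ach-export"
flock -xn /mnt/gfs/${AH_SITE_NAME}/files-private/achex1.lck -c "drush -l https://mysite.com -r /var/www/html/${AH_SITE_NAME}/docroot ach-export"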