Content Hub utilizes queues to facilitate the export and import of content from Drupal sites. To process the queues in the background, Acquia recommends using the scheduled jobs feature of Cloud Platform.
In Cloud Platform Enterprise and Site Factory, you can configure scheduled jobs for each environment through the Cloud Platform user interface. For more information, see Using scheduled jobs to support your application.
When a scheduled job runs a Drush command, specify the target site with the --uri or -l option. This ensures that the command targets the correct site and allows you to operate within the desired site environment. To limit how long each run processes a queue, use the --time-limit option of the Drush command as a control measure. The export and import queues used by Content Hub differ in their resource requirements, so tune the scheduled job for each queue separately.
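To illustrate the --time-limit control, the following hedged sketch uses Drush's generic queue:run command; the queue machine name is a placeholder for illustration, not a value from this page:

```bash
# Process a queue for at most 120 seconds, then exit cleanly.
# Replace <queue_name> with the machine name of the queue you process;
# --time-limit is an option of Drush's generic queue:run command.
drush -l https://<mysite>.com queue:run <queue_name> --time-limit=120
```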
The frequency of scheduled jobs can be configured to meet specific business requirements. Processing the queues at higher frequencies minimizes the delay between exporting entities from a publisher and importing entities into a subscriber.
Scheduled jobs on Cloud Next can run at most once every 5 minutes. If you require a higher frequency, such as every minute, you can use a wrapper script. Example wrapper scripts are included with Content Hub 3.3.0 and later.
To prevent multiple scheduled jobs from interfering with each other, Acquia recommends using flock, a file-based process locking utility available on Cloud Platform.
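To illustrate the locking behavior, here is a minimal sketch using a temporary path rather than the lock files from this page; the second command exits immediately because the first still holds the lock:

```bash
# Take an exclusive (-x), non-blocking (-n) lock and hold it for 30 seconds.
flock -xn /tmp/demo.lck -c "sleep 30" &
sleep 1   # give the background job a moment to acquire the lock
# The second attempt fails fast instead of waiting: flock exits non-zero
# and the wrapped command never runs.
flock -xn /tmp/demo.lck -c "echo never runs" || echo "lock busy, skipping"
```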
Acquia recommends reviewing the wrapper script examples in the acquia_contenthub module for version 3.3.0 or later, and adapting them as needed:
- scripts/queue_cron_example.sh: This Bash script executes a single instance of the drush queue command while using flock to prevent concurrency issues.
- scripts/queue_cron_one_min.sh: This Bash script implements a while loop to run the drush queue command every minute for a duration of 5 minutes. It also uses flock to prevent concurrency. This script is designed for use on Cloud Next, where per-minute scheduled jobs are not permitted.

The following table provides example values for the parameters used in scheduled jobs; a minimal sketch of such a wrapper follows the table.
| Example Value | Description | Notes |
|---|---|---|
| achex0.lck, achim0.lck | Include a numerical index in your flock lock files for both the export queue (achex) and the import queue (achim) so that you can run multiple queue processors. | In the example values, replace 0 with the index that applies to your application. |
| https://<mysite>.com | To ensure that absolute URLs are generated accurately, you must explicitly define the URI of the Drupal site in each scheduled job that runs a Drush command. | In the example value, replace <mysite> with the domain of your Drupal site. |
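The following is a minimal sketch in the spirit of scripts/queue_cron_one_min.sh, not the shipped script; it assumes the lock file, site URI, and docroot paths used in the examples on this page, and processes the import queue once per minute inside a single 5-minute scheduled job:

```bash
#!/bin/bash
# Sketch of a per-minute wrapper for Cloud Next, where scheduled jobs run
# at most once every 5 minutes. The shipped script in the acquia_contenthub
# module (3.3.0 and later) is the authoritative version.
LOCK_FILE="/mnt/gfs/${AH_SITE_NAME}/files-private/achim0.lck"
SITE_URI="https://mysite.com"
DOCROOT="/var/www/html/${AH_SITE_NAME}/docroot"

for i in 1 2 3 4 5; do
  # -x: exclusive lock; -n: exit immediately if another run holds it, so
  # overlapping scheduled jobs never process the queue concurrently.
  flock -xn "${LOCK_FILE}" -c "drush -l ${SITE_URI} -r ${DOCROOT} ach-import"
  # Sleep between iterations, but not after the last one, so the job
  # finishes within its 5-minute window.
  [ "$i" -lt 5 ] && sleep 60
done
exit 0
```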
The export queue consists of the content from a publisher that is ready for syndication through the Content Hub service. To set up a scheduled job for processing the export queue on a publisher, run the following command:
```
flock -xn /mnt/gfs/${AH_SITE_NAME}/files-private/achex0.lck -c "drush -l https://mysite.com -r /var/www/html/${AH_SITE_NAME}/docroot ach-export"
```
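For illustration only: if this job were expressed as a raw cron entry running every 5 minutes, it might look like the line below. In the Cloud Platform UI you enter the command and the frequency in separate fields, so treat the schedule prefix as a hypothetical rendering.

```
# Hypothetical cron rendering: process the export queue every 5 minutes.
*/5 * * * * flock -xn /mnt/gfs/${AH_SITE_NAME}/files-private/achex0.lck -c "drush -l https://mysite.com -r /var/www/html/${AH_SITE_NAME}/docroot ach-export"
```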
The import queue consists of content from the Content Hub service ready to be created or updated on a subscriber. To set up a scheduled job for processing the import queue on a subscriber, run the following command:
```
flock -xn /mnt/gfs/${AH_SITE_NAME}/files-private/achim0.lck -c "drush -l https://mysite.com -r /var/www/html/${AH_SITE_NAME}/docroot ach-import"
```
Logs related to import or export queues can be found in the Drupal watchdog logs or in the Logs section of the Publisher's Dashboard. Acquia recommends that you use the ach-export and ach-import Drush commands for cron jobs, because they ensure that any Drush-specific messages are logged in the Drupal watchdog logs. When a cron job must write its output to a file for debugging, append the following to the command:
For the import queue:

```
&>> /var/log/sites/${AH_SITE_NAME}/logs/$(hostname -s)/achim0-`date "+\%F"`.log
```
For the export queue:

```
&>> /var/log/sites/${AH_SITE_NAME}/logs/$(hostname -s)/achex0-`date "+\%F"`.log
```

Log files generated by appending output directly to files do not rotate automatically and can rapidly fill the available disk space. Acquia recommends using Drupal's watchdog logging or the Publisher's Dashboard logs for troubleshooting Content Hub queue processing. For more information, see the guidelines on log rotation for customer-owned logs.
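Put together with the import command above, a debugging-only scheduled job with file logging might look like the following sketch, assuming the same paths as the earlier examples:

```
# For temporary debugging only: append all drush output to a dated log file.
# The backslash in \%F escapes the percent sign, which cron otherwise
# treats as a newline.
flock -xn /mnt/gfs/${AH_SITE_NAME}/files-private/achim0.lck -c "drush -l https://mysite.com -r /var/www/html/${AH_SITE_NAME}/docroot ach-import" &>> /var/log/sites/${AH_SITE_NAME}/logs/$(hostname -s)/achim0-`date "+\%F"`.log
```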
If your queues are processing slowly, consider the following options to speed up the processing of queue items:
- The -i option, which can be used to define additional queue processors.
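For example, to run two import processors in parallel, create a separate scheduled job for each and give every processor its own lock file index (achim0, achim1, and so on) so the jobs never block each other. A hedged sketch, reusing the paths assumed throughout this page:

```
# Scheduled job 1: first import processor, lock index 0.
flock -xn /mnt/gfs/${AH_SITE_NAME}/files-private/achim0.lck -c "drush -l https://mysite.com -r /var/www/html/${AH_SITE_NAME}/docroot ach-import"

# Scheduled job 2: second import processor, lock index 1.
flock -xn /mnt/gfs/${AH_SITE_NAME}/files-private/achim1.lck -c "drush -l https://mysite.com -r /var/www/html/${AH_SITE_NAME}/docroot ach-import"
```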