Acquia Content Hub enables you to export and import your content in bulk, making it easier to transfer your initial content from one website to another. The Content Hub 2.x module leverages Drupal queues to export and import content for your sites. Acquia recommends that you configure scheduled jobs, also known as cron jobs, to ensure that the export and import queues are always processed in the background.
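Before configuring the scheduled jobs, you can confirm that the queues exist and see how many items they hold with Drush's queue:list command. The following is a minimal sketch; run it from the site's docroot.
# Sketch: list all Drupal queues and their item counts. Among them, you
# should see acquia_contenthub_publish_export on the publisher and
# acquia_contenthub_subscriber_import on the subscriber.
drush queue:list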
In a multisite fleet, use the --uri or -l Drush option to identify the correct site.
For Content Hub, you can set scheduled jobs by using flock. Flock ensures that only one process runs at a time for every scheduled job. Therefore, you can run a scheduled job every minute. You can define the frequency at which the queues run according to your needs.
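For example, a crontab entry in the following pattern runs the import queue every minute, and flock guarantees that only one instance runs at a time. This is a simplified sketch of the full commands shown later on this page; the site URL and lock file name are placeholders.
# Simplified sketch: flock -xn exits immediately if a previous run of
# this job still holds the lock, so overlapping runs never stack up.
* * * * * flock -xn /tmp/achim0.lck -c "drush -l https://subscriber-site.com/ queue:run acquia_contenthub_subscriber_import"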
If the queues take a long time to run or slow down because many items are produced, consider the following options to ensure that queue items run faster:
If the scheduled jobs are created with flock in the following format, you can set each job to execute every minute, because flock guarantees that only a single instance of the job runs at any time.
Assuming the following:
Publisher URL: https://publisher-site.com
Subscriber URL: https://subscriber-site.com
Acquia Cloud SiteGroup: <account>
Environment: <env>
(For example, 01dev, 01test, or 01live)
The number in the lock file and log file names in each command must be different between scheduled jobs in the same environment. Otherwise, some jobs might not run. For example, achim0.lck, achex0.log, achim1.log.
In some cases, multiple environments run on the same web node, which can cause lock file and log file conflicts. For example, the dev and stage environments might run on the same server.
In such a case, you can use the following script to list the Content Hub cron jobs in each environment:
# List each environment's cron commands and filter for Content Hub jobs,
# so that you can compare the lock and log IDs they use.
sitegroup=mysitegroup
for env in dev stg prod; do
  echo "cron list for $env:"
  acli api:environments:cron-job-list $sitegroup.$env | jq '.[] | .command' | grep acquia_contenthub
done
Acquia recommends assigning different lock and log IDs to each environment that runs on the same hardware. The preceding script requires Acquia CLI (acli) and jq.
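For example, if the dev and stage environments share a web node, you might assign ID 0 to dev and ID 1 to stage. The following is a hedged sketch with the drush arguments shortened; the full command format appears next.
# Hedged sketch: dev uses ID 0 and stage uses ID 1, so their lock files
# and log files never collide on the shared node.
flock -xn /tmp/achim0.lck -c "drush ... queue:run acquia_contenthub_subscriber_import"   # dev
flock -xn /tmp/achim1.lck -c "drush ... queue:run acquia_contenthub_subscriber_import"   # stage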
flock -xn /tmp/achim0.lck -c "drush -l https://subscriber-site.com/ -r /var/www/html/${AH_SITE_NAME}/docroot -dv queue:run acquia_contenthub_subscriber_import" &>> /var/log/sites/${AH_SITE_NAME}/logs/$(hostname -s)/achim0-`date "+\%F"`.log
Here, achim stands for Acquia Content Hub import.
Note
To delete items in the import queue, you can click Purge on the Import Queue page. This deletes all the items in acquia_contenthub_subscriber_import.
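Alternatively, Drush's queue:delete command empties a named queue from the command line. The following is a minimal sketch, assuming it runs on the subscriber site:
# Sketch: delete all items in the import queue without using the UI.
drush -l https://subscriber-site.com/ queue:delete acquia_contenthub_subscriber_import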
flock -xn /tmp/achex0.lck -c "drush -l https://publisher-site.com/ -r /var/www/html/${AH_SITE_NAME}/docroot -dv queue:run acquia_contenthub_publish_export" &>> /var/log/sites/${AH_SITE_NAME}/logs/$(hostname -s)/achex0-`date "+\%F"`.log
Here, achex stands for Acquia Content Hub export.
Note
The preceding commands are compatible with the Scheduled Jobs user interface, which runs them through the cron interface on the servers. However, if you want to run the commands directly on the command line, for example to verify that they work before adding them as scheduled jobs, remove the backslash (\) in front of the percent sign (%). Percent signs (%) have a special meaning in crontabs. For more information, see escaping double quotes and percent signs in cron.
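For example, the export command looks like the following when run manually; the only change from the scheduled-job version is %F instead of \%F in the date format:
flock -xn /tmp/achex0.lck -c "drush -l https://publisher-site.com/ -r /var/www/html/${AH_SITE_NAME}/docroot -dv queue:run acquia_contenthub_publish_export" &>> /var/log/sites/${AH_SITE_NAME}/logs/$(hostname -s)/achex0-`date "+%F"`.log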
To verify that your Content Hub queues are processed, view the Drush logs by using the following command:
cat /var/log/sites/${AH_SITE_NAME}/logs/$(hostname -s)/achex0-`date "+%F"`.log | grep -v "using pcntl_exec" | grep -v "Processed 0 items from"
Sample output:
[success] Processed 1 items from the acquia_contenthub_publish_export queue in 0.84 sec.
[success] Processed 4 items from the acquia_contenthub_publish_export queue in 1.23 sec.
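The same check works for the import queue on the subscriber; substitute the import log name used earlier on this page:
cat /var/log/sites/${AH_SITE_NAME}/logs/$(hostname -s)/achim0-`date "+%F"`.log | grep -v "using pcntl_exec" | grep -v "Processed 0 items from"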