Setup - Delivery endpoint

The delivery endpoint serves cached items, so you have to set up a command that recalculates these cached items on a regular basis. If you have set the save hook to Deliver directly on save, you can omit this step, but setting up the process is still recommended because it allows you to rebuild the cache table.

The command can be set up and executed directly as a cronjob by executing the datahub:export:productsup command (for details see the --help option), or it can be set up via the Process Manager.
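A cronjob setup can be sketched as below. The schedule, PHP binary path, and project path are assumptions; adjust them to your environment:

```shell
# Example crontab entries (paths and schedule are assumptions, adapt to your setup).

# Run the export every 15 minutes, processing only items queued since the last run:
*/15 * * * * /usr/bin/php /var/www/pimcore/bin/console datahub:export:productsup --only-queue-items

# Nightly full export to rebuild the whole cache table:
0 2 * * * /usr/bin/php /var/www/pimcore/bin/console datahub:export:productsup
```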

Execution via Process Manager

To set up the calculation job, just add a Pimcore Command job with the command datahub:export:productsup.

Export process

(Screenshot: export process overview)

  • In the Command options field you can provide the DataHub configuration name/id via the config-id parameter, e.g. --config-id="products-up". If no config-id is provided, all configurations of type productsup are exported.
  • By default, a full export is done. If you only want to export the changed items, add the option --only-queue-items.
  • If you have a lot of data to export, you can use the multiprocessing option by adding the parameters --processes=5 --batch-size=50.
    • processes defines the maximum number of processes that are executed simultaneously.
    • batch-size defines the number of items that are processed per process.
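The options above can be combined on the command line. The configuration name "products-up" and the process/batch values are only illustrative:

```shell
# Full export of a single configuration (replace "products-up" with your
# own DataHub configuration name/id):
bin/console datahub:export:productsup --config-id="products-up"

# Incremental export of only the queued (changed) items:
bin/console datahub:export:productsup --only-queue-items

# Parallelized export: at most 5 worker processes, 50 items per batch:
bin/console datahub:export:productsup --processes=5 --batch-size=50
```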
Loggers

(Screenshot: loggers)