# Setup - Delivery Endpoint
The delivery endpoint delivers cached items, so you have to set up a command that calculates these cached items on a
regular basis. If you have set the save hook to
`Deliver directly on save`, you can omit this step, but
it is still recommended to set up the process because it allows you to rebuild the cache table.

The command can be set up and executed directly as a cronjob by executing the
`datahub:export:productsup` command (see its
`--help` option for details), or it can be set up via the Process Manager.
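As a sketch of the cronjob variant (the schedule, PHP binary path, project path, and log file below are illustrative assumptions, not part of this bundle; Pimcore's console script conventionally lives at `bin/console`):

```shell
# Illustrative crontab entry: recalculate the cached items every 15 minutes.
# Adjust the PHP binary, project path, and schedule to your installation.
*/15 * * * * /usr/bin/php /var/www/pimcore/bin/console datahub:export:productsup >> /var/log/productsup-export.log 2>&1
```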
## Execution via Process Manager
To set up the calculation job, just add a
Pimcore Command job with the command `datahub:export:productsup`.
- In the
`Command options` field you can provide the DataHub configuration name/id as
`--config-id="products-up"`. If no
`config-id` is provided, all configurations of type
`productsup` will be exported.
- By default, a full export is done. If you only want to export the changed items, you can add the corresponding command option (see `--help` for its name).
- If you have a lot of data to export, you can use the multiprocessing options:
  - `processes` defines the maximum number of processes that should be executed simultaneously.
  - `batch-sizes` defines the number of items that should be processed per process.
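Putting the options above together, a combined invocation might look like the following. This is a hedged sketch: the source does not show the exact flag syntax for `processes` and `batch-sizes`, so the `--name=value` form and the numeric values are assumptions to be checked against `--help`.

```shell
# Hypothetical invocation: export only the "products-up" configuration,
# using up to 4 parallel processes with 200 items per process (example values).
bin/console datahub:export:productsup --config-id="products-up" --processes=4 --batch-sizes=200
```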