I have been tasked with importing products from an external supplier database so clients can purchase them on our website.
I am wondering if it's best to just make a remote request to their API from a product page (every time it's loaded) and display/filter products that way.
Or whether I should set up a cron job that, each morning, looks for new/updated products and creates them as products locally on our site.
If I went with the cron approach, I think I would still run a GET request on the product page (single.php) to check the stock level, just to ensure the product is still in stock at the supplier's end, as the stock amount could vary throughout the day.
Would love some feedback: how would you do it?
>I am wondering if it's best to just make a remote request to their API from a product page (every time it's loaded) and display/filter products that way.
Why waste the third-party API's resources every time a product page on your site gets a visitor? And what about the delay added to your page load while you fetch, parse, and display data from the API?
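One common middle ground is to cache the live stock lookup for a short TTL, so repeated page views within that window don't hit the supplier's API at all. Here's a minimal, language-agnostic sketch of the idea (the names `fetch_stock`, `StockCache`, and the TTL value are all hypothetical, not anything from your supplier's API):

```python
import time

class StockCache:
    """Cache supplier stock levels for a short TTL so repeated
    page views don't each trigger a remote API call.
    `fetch` is any callable mapping a product id to a stock count
    (e.g. a wrapper around the supplier's HTTP endpoint)."""

    def __init__(self, fetch, ttl_seconds=300):
        self.fetch = fetch
        self.ttl = ttl_seconds
        self._cache = {}  # product_id -> (expires_at, stock)

    def get_stock(self, product_id):
        now = time.time()
        entry = self._cache.get(product_id)
        if entry and entry[0] > now:
            return entry[1]  # still fresh: serve cached value, no API call
        stock = self.fetch(product_id)  # cache miss/expired: one remote call
        self._cache[product_id] = (now + self.ttl, stock)
        return stock

# Example: a fake fetcher standing in for the supplier API.
calls = []
def fetch_stock(product_id):
    calls.append(product_id)
    return 7  # pretend the supplier reports 7 units

cache = StockCache(fetch_stock, ttl_seconds=60)
cache.get_stock("sku-1")  # first view: hits the "API"
cache.get_stock("sku-1")  # second view within the TTL: served from cache
print(len(calls))  # only one remote call was made
```

In a WordPress context the same pattern is usually done with the Transients API (`set_transient`/`get_transient`) rather than an in-process dictionary, since PHP requests don't share memory.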