We have a searchable list of events in the region at http://visitbirmingham.com/explore-birmingham/. A significant proportion of the data driving this feature comes from the Press Association, who provide a set of XML files that can be downloaded from their FTP site and which they keep updated on a regular schedule. We also add data to Explore directly on occasion.
The Current Process
This is a simplified diagram of the process as it currently stands. A console application on a local server is woken every night by a scheduled task. It checks the FTP site for any files, downloads them, and begins parsing the content. This is converted into CRM Custom Entities and loaded in through the CRM Organization Service.
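The actual console application is a .NET program, but the shape of the parse step can be sketched in Python. The feed schema, element names, and entity field names below are hypothetical, for illustration only:

```python
import xml.etree.ElementTree as ET

# Hypothetical shape of a Press Association event feed; the real
# schema is not reproduced here.
SAMPLE_FEED = """
<events>
  <event id="101">
    <title>Symphony at the Hall</title>
    <venue>Symphony Hall</venue>
    <start>2015-06-01</start>
  </event>
</events>
"""

def parse_feed(xml_text):
    """Convert a downloaded feed into dicts shaped like the
    attributes of a CRM Custom Entity record."""
    root = ET.fromstring(xml_text)
    entities = []
    for event in root.findall("event"):
        entities.append({
            "pa_eventid": event.get("id"),          # illustrative field names
            "pa_title": event.findtext("title"),
            "pa_venue": event.findtext("venue"),
            "pa_start": event.findtext("start"),
        })
    return entities

if __name__ == "__main__":
    for entity in parse_feed(SAMPLE_FEED):
        print(entity)
```

In the real process each of these dict-shaped records would be created or updated through the CRM Organization Service rather than printed.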
There is a plugin associated with the create and update messages that the Asynchronous Service calls. This plugin converts the CRM Custom Entities back into another XML document and transmits it over HTTP to an endpoint on VisitBirmingham.com, where the data is processed and added to the site's Content Management System.
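The push step is, in essence, "serialise the entity to XML, POST it over HTTP". A minimal Python sketch of that idea follows; the field names and the endpoint URL are placeholders, not the real contract between the plugin and the site:

```python
import urllib.request
import xml.etree.ElementTree as ET

# Placeholder only; the real endpoint on VisitBirmingham.com is not shown here.
ENDPOINT = "https://example.invalid/import"

def entity_to_xml(entity):
    """Serialise a CRM-entity-shaped dict back into an XML document."""
    root = ET.Element("event", id=entity["pa_eventid"])
    for field in ("pa_title", "pa_venue", "pa_start"):
        child = ET.SubElement(root, field.replace("pa_", ""))
        child.text = entity[field]
    return ET.tostring(root, encoding="unicode")

def push_to_site(entity, endpoint=ENDPOINT):
    """POST the serialised entity to the site's import endpoint."""
    payload = entity_to_xml(entity).encode("utf-8")
    request = urllib.request.Request(
        endpoint, data=payload,
        headers={"Content-Type": "application/xml"},
    )
    with urllib.request.urlopen(request) as response:
        return response.status
```

Note that in the current design this push happens once per create/update message, with nothing to throttle how quickly those messages arrive, which is exactly the weakness described next.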
The main issues with this process are that it frequently triggers duplicate update messages, that there is no rate-limiting capability (which can cause the Asynchronous Service to effectively denial-of-service our own website), and that we are moving everything to the cloud while all of this runs on-premise.
The New Process
The inner workings of the new process aren’t going to change very much, just where they live and how they are triggered.
The plugin and messaging will be completely removed and replaced with another WebJob running on a schedule that queries CRM for records to create and update: a date field on the entity will be set when it is uploaded, and this is compared to the modified date to identify affected records. It is at this point that we will add in rate limiting too.
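The date-comparison and rate-limiting logic described above can be sketched as follows. This is an illustration of the approach, not the WebJob itself; the field names (`modifiedon`, `uploadedon`) and the one-push-per-interval throttle are assumptions, and in reality the records would come from a CRM query rather than an in-memory list:

```python
import time

def needs_upload(record):
    """A record is affected if it has never been uploaded, or has
    been modified since the date stamped at its last upload."""
    uploaded = record.get("uploadedon")
    return uploaded is None or record["modifiedon"] > uploaded

def push_affected(records, push, min_interval=1.0):
    """Push each affected record, pausing between calls so the site
    is never hit faster than one request per min_interval seconds."""
    pushed = 0
    for record in records:
        if not needs_upload(record):
            continue  # already up to date on the site; skip it
        push(record)
        pushed += 1
        time.sleep(min_interval)
    return pushed
```

Because the comparison is done against a stamp set at upload time, re-running the job is harmless: a record that has already been pushed and not modified since is simply skipped, which also removes the duplicate-update problem of the plugin approach.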