Datastream 10k tables limitation

Hello,

The Datastream FAQ states the following:

How many tables can I include in a single stream?
Datastream can handle up to 10,000 tables in a single stream. There's no technical reason to split a database into multiple streams. There might be some business considerations to use different streams for different logical flows, but each stream adds load on the source database. Such load is negligible for CDC, but can be significant for backfill.
The limitation pages for each source specify that streams can have up to 10,000 tables. Does this mean that Datastream can't run CDC operations in parallel for more than 10,000 tables at the same time?
No. The limit mentioned is per stream. You can have multiple streams with the total number of tables exceeding 10,000 tables.

To what does this limitation refer?

I ran some tests: I created a stream and, in the `source configuration` step, set the `objects to include` property to a single pattern that matched more than 10,000 tables.

Both the backfill and CDC processes worked for all 10,000+ tables, and no errors appeared in the logs.
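
For context, here is roughly what that pattern-based selection corresponds to when built programmatically (a minimal sketch assuming a MySQL source and the `com.google.cloud.datastream.v1` classes; the project, location, and database names are placeholders, not my real setup):

```java
import com.google.cloud.datastream.v1.MysqlDatabase;
import com.google.cloud.datastream.v1.MysqlRdbms;
import com.google.cloud.datastream.v1.MysqlSourceConfig;
import com.google.cloud.datastream.v1.SourceConfig;

public class IncludeByPattern {
  public static void main(String[] args) {
    // Assumption: a database entry with no explicit tables includes every table
    // in that database, which is what a console pattern like "mydb.*" selects.
    MysqlRdbms includeObjects = MysqlRdbms.newBuilder()
        .addMysqlDatabases(MysqlDatabase.newBuilder()
            .setDatabase("mydb") // placeholder database name
            .build())
        .build();

    SourceConfig sourceConfig = SourceConfig.newBuilder()
        // Placeholder connection profile resource name.
        .setSourceConnectionProfile(
            "projects/my-project/locations/us-central1/connectionProfiles/my-source")
        .setMysqlSourceConfig(MysqlSourceConfig.newBuilder()
            .setIncludeObjects(includeObjects)
            .build())
        .build();

    System.out.println(sourceConfig);
  }
}
```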

I then tried the opposite: listing each of my 10,000+ table names individually in `objects to include`. The console stops working/saving once the list gets close to 5,500 items. The same thing happened when I tried to update the stream via the Java client, as sketched below.
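
The per-table attempt looked roughly like this (again a sketch rather than my exact code: I'm assuming the `updateStreamAsync` call on `DatastreamClient`, and the names are placeholders; the real list had 10,000+ entries):

```java
import com.google.cloud.datastream.v1.DatastreamClient;
import com.google.cloud.datastream.v1.MysqlDatabase;
import com.google.cloud.datastream.v1.MysqlRdbms;
import com.google.cloud.datastream.v1.MysqlSourceConfig;
import com.google.cloud.datastream.v1.MysqlTable;
import com.google.cloud.datastream.v1.SourceConfig;
import com.google.cloud.datastream.v1.Stream;
import com.google.protobuf.FieldMask;
import java.util.List;

public class UpdateIncludeList {
  public static void main(String[] args) throws Exception {
    // Placeholder names standing in for the 10,000+ real tables.
    List<String> tableNames = List.of("table_0001", "table_0002" /* , ... */);

    // Build one include entry per table under a placeholder database.
    MysqlDatabase.Builder db = MysqlDatabase.newBuilder().setDatabase("mydb");
    for (String name : tableNames) {
      db.addMysqlTables(MysqlTable.newBuilder().setTable(name).build());
    }

    Stream stream = Stream.newBuilder()
        .setName("projects/my-project/locations/us-central1/streams/my-stream") // placeholder
        .setSourceConfig(SourceConfig.newBuilder()
            .setMysqlSourceConfig(MysqlSourceConfig.newBuilder()
                .setIncludeObjects(MysqlRdbms.newBuilder().addMysqlDatabases(db).build())
                .build())
            .build())
        .build();

    try (DatastreamClient client = DatastreamClient.create()) {
      // Replace only the source configuration of the existing stream.
      FieldMask updateMask = FieldMask.newBuilder().addPaths("source_config").build();
      client.updateStreamAsync(stream, updateMask).get();
    }
  }
}
```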

This made me wonder what the 10,000-table limit mentioned in the FAQ actually refers to. Is it the number of items that can be entered in the stream's `objects to include` property, or the maximum number of tables the source database can have? The FAQ doesn't make this 100% clear.

If the limit applies to the source, why did my test with more than 10,000 tables work? What kind of error messages should I expect in the logs?
