Ideas


When more than 100 jobs are parallelized, fit them within PerfectQueue's limit.

The customer has over 300 td jobs (using Data Connector).

They want these jobs to finish quickly, so they want to run all of them in parallel.

However, PerfectQueue's limit is 96 by default, so a max-queue-limit error occurs.

 

As is:

The customer manually splits the jobs into groups of about 50.

To be:

When many jobs (over 100) are parallelized, Workflow should queue them and let them wait instead of failing at the limit (see the sketch below).
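
Below is a minimal Python sketch of the requested behavior, under the assumption that submissions should block when the queue is full rather than raise a max-queue-limit error. The queue size mirrors the default PerfectQueue limit of 96 mentioned above; the worker count, job body, and function names are hypothetical placeholders, not Workflow's actual implementation.

    import queue
    import threading
    import time

    QUEUE_LIMIT = 96                     # default PerfectQueue limit noted above
    job_queue = queue.Queue(maxsize=QUEUE_LIMIT)

    def submit_job(job_id):
        # Instead of erroring out when the queue is full, block until a slot frees up.
        job_queue.put(job_id)            # blocks while the queue holds QUEUE_LIMIT items

    def worker():
        while True:
            job_id = job_queue.get()
            time.sleep(0.01)             # placeholder for real job execution
            job_queue.task_done()

    # A few workers drain the queue; submitting 300 jobs never overflows it,
    # the extra submissions simply wait their turn.
    for _ in range(4):
        threading.Thread(target=worker, daemon=True).start()

    for job_id in range(300):
        submit_job(job_id)

    job_queue.join()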

  • Atsushi Kurumada
  • Apr 20 2017
  • Likely to Implement
Product Component: Workflow Core
  • Jul 27, 2017

    Admin Response

    While we wait for a more comprehensive fix, we have increased our PerfectQueue limit to 256 items, up from a little under 100 items. We hope this will limit issues in the meantime, but we do believe this is an important fix for us to make in the future.

  • Atsushi Kurumada (demo) commented
    April 20, 2017 10:02

    Supplement:
    A parent task has 300 child tasks (with "_parallel: True").

     

    Implementation image (example):

    First, check the Data Connector's concurrency; in this example it is 3.

    When the 300 parallel child tasks are invoked, the child-task limit is set to 3.

    When one child task finishes, the next child task starts, looping until all child tasks have finished.
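
    As a rough illustration of this implementation image (not Workflow's actual code), the Python sketch below caps the number of in-flight child tasks at the connector's concurrency of 3; whenever one child task finishes, the next one starts, until all 300 have run. The function names and the sleep-based task body are hypothetical placeholders.

        from concurrent.futures import ThreadPoolExecutor
        import time

        CONNECTOR_CONCURRENCY = 3    # concurrency reported by the Data Connector
        CHILD_TASK_COUNT = 300       # child tasks under the parent task

        def run_child_task(task_id):
            # Placeholder for the real child task (e.g. a Data Connector load).
            time.sleep(0.01)
            return task_id

        # At most 3 child tasks run at once; as each finishes, the next starts,
        # looping until all 300 child tasks have finished.
        with ThreadPoolExecutor(max_workers=CONNECTOR_CONCURRENCY) as pool:
            results = list(pool.map(run_child_task, range(CHILD_TASK_COUNT)))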

  • Keisuke Noda commented
    May 2, 2017 08:41

    Similar to this request: if we could set a limit for each parallel execution, it would help us avoid such unexpected queueing.