If you want 100 jobs to run simultaneously, you need to run 100 workers, because each worker processes jobs one after another. Something tells me this will not satisfy you.
One option is to gather the numbers in a temporary table and push them in batches of 100 from a single job (or better, a scheduled task) using the HTTP client's concurrent requests: https://laravel.com/docs/12.x/http-client#concurrent-requests
The scheduled task can run every minute and check how many numbers are in the temp table. If there are at least 100, take the first 100 and push them to the external API concurrently; if fewer, just exit.
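A rough sketch of what that could look like, assuming a `numbers` table with `id` and `number` columns and a made-up endpoint:

```php
<?php

// routes/console.php (a rough sketch; the `numbers` table and the
// external-api.example.com endpoint are placeholders, not real names).

use Illuminate\Http\Client\Pool;
use Illuminate\Http\Client\Response;
use Illuminate\Support\Facades\DB;
use Illuminate\Support\Facades\Http;
use Illuminate\Support\Facades\Schedule;

Schedule::call(function () {
    // Grab the oldest 100 pending numbers; bail out if the batch is not full yet.
    $batch = DB::table('numbers')->oldest('id')->limit(100)->get();

    if ($batch->count() < 100) {
        return;
    }

    // Send all 100 requests concurrently instead of one by one.
    $responses = Http::pool(fn (Pool $pool) => $batch->map(
        fn ($row) => $pool->as((string) $row->id)
            ->post('https://external-api.example.com/numbers', ['number' => $row->number])
    )->all());

    // Remove only the rows that were accepted; failed ones (a pool entry can be
    // a ConnectionException instead of a Response) stay for the next run.
    $sent = $batch->filter(fn ($row) =>
        $responses[(string) $row->id] instanceof Response
            && $responses[(string) $row->id]->successful()
    );

    DB::table('numbers')->whereIn('id', $sent->pluck('id'))->delete();
})->name('push-pending-numbers')->withoutOverlapping()->everyMinute();
```

The `withoutOverlapping()` call (which requires a `name()` for scheduled closures) keeps a slow run from racing with the next minute's run over the same rows.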
Another approach is not to wait for the API response in the job at all: push the number to the API, get a task id back (the API must provide one for long-running tasks so you can request the result later), and exit. This way your hundreds of jobs are processed quickly. If you need to fetch the result by task id, you can do it in a subsequent chained job. I use this approach with an API that takes video files and returns transcripts after a significant delay, and it works fine.
The task id returned by the API is stored by the "send" job and retrieved by the "check result" job using context: https://laravel.com/docs/12.x/context
"Check result" job releases back to queue using $this->release(60) when result is not ready yet; when the result is ready it exits normally. This way a worker is never stuck in useless waiting for API response.