
jack27696:

I need to make 1000’s of HTTP requests and I’m not sure what the best approach is

I need to build a service where the user can enter a list of domain names to monitor. This could run into the 1000s.

Periodically, and when the list is first submitted by the user, the app should check the status of each website (to see if it's hosting any content) and then check whether there are any MX records associated with the domain.

I’m relatively new to this kind of thing and I’m not sure what the best approach would be.

I’ve looked at dispatching to a queue, pooling the requests, or simply using Node rather than Laravel to harness async.

Any advice would be greatly appreciated.

LaryAI:

Handling thousands of HTTP requests efficiently requires a strategy that can manage high concurrency without overwhelming your server or the network. Here are a few approaches you can consider, especially focusing on using Node.js for its asynchronous capabilities, which are well-suited for handling multiple I/O operations concurrently.

1. Using Node.js with Async/Await and Promise.all

Node.js is particularly good for handling asynchronous operations. You can use Promise.all to handle multiple requests concurrently. However, be cautious with Promise.all as it will attempt to start all operations at once. For thousands of domains, you might want to chunk the requests to avoid hitting limits or overwhelming your server.

Here's a basic example using axios to make HTTP requests and Node's built-in dns.promises resolver to check MX records:

const axios = require('axios');
const { Resolver } = require('dns').promises;
const resolver = new Resolver();

async function checkDomain(domain) {
    try {
        // Check whether the website serves any content; set a timeout
        // so one unresponsive host doesn't stall the whole run
        const response = await axios.get(`http://${domain}`, { timeout: 10000 });
        const status = response.status;

        // Check for MX records
        const mxRecords = await resolver.resolveMx(domain);
        return { domain, status, mxRecords };
    } catch (error) {
        return { domain, status: 'Error', mxRecords: [], error: error.message };
    }
}

async function checkDomains(domains) {
    // Note: this starts every request at once. That's fine for a
    // handful of domains, but chunk the list when checking thousands.
    const promises = domains.map(domain => checkDomain(domain));
    return Promise.all(promises);
}

// Example usage
const domains = ['example.com', 'nonexistentdomain.com'];
checkDomains(domains).then(results => {
    console.log(results);
});
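To keep thousands of checks from starting simultaneously, you can process the list in fixed-size batches. Here's a minimal sketch of that idea; the batch size of 50 is an arbitrary starting point to tune, and `checkDomain` refers to the function defined above:

```javascript
// Split an array into fixed-size batches.
function chunkArray(items, size) {
    const chunks = [];
    for (let i = 0; i < items.length; i += size) {
        chunks.push(items.slice(i, i + size));
    }
    return chunks;
}

// Run `checkFn` over all items, at most `batchSize` at a time.
// Each batch completes fully before the next one starts.
async function checkInBatches(items, checkFn, batchSize = 50) {
    const results = [];
    for (const batch of chunkArray(items, batchSize)) {
        results.push(...await Promise.all(batch.map(checkFn)));
    }
    return results;
}

// Usage with the checkDomain function from the example above:
// checkInBatches(domains, checkDomain, 50).then(console.log);
```

Passing the checker in as a parameter keeps the batching logic reusable and easy to test in isolation.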

2. Using a Queue System

For a more robust solution, especially if this needs to be scaled or managed over time, consider using a queue system. This approach involves adding each domain-check task to a queue, which is then processed by workers. This can be integrated into a Node.js application or even a Laravel application using Laravel Queues.

Here's a conceptual example using Bull (a Redis-based queue for Node.js):

const Queue = require('bull');
const checkDomainQueue = new Queue('checkDomain', 'redis://127.0.0.1:6379');

// Process up to 5 jobs concurrently; checkDomain is the
// function from the first example.
checkDomainQueue.process(5, async (job) => {
    return checkDomain(job.data.domain);
});

// Enqueue one job per domain
domains.forEach(domain => {
    checkDomainQueue.add({ domain });
});

checkDomainQueue.on('completed', (job, result) => {
    console.log('Completed job', job.id, 'with result', result);
});

3. Rate Limiting

Regardless of the approach, consider implementing rate limiting to avoid hitting API limits or overwhelming the network. This can be done by chunking the requests and introducing a delay between chunks.
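One simple way to do this is to pause between batches. Here's a sketch of that pattern; the batch size and delay below are illustrative values, not tuned recommendations:

```javascript
// Promise-based sleep helper.
const sleep = (ms) => new Promise((resolve) => setTimeout(resolve, ms));

// Process `items` with `handler`, `batchSize` at a time, pausing
// `delayMs` between batches to spread the load over time.
async function processWithDelay(items, handler, batchSize, delayMs) {
    const results = [];
    for (let i = 0; i < items.length; i += batchSize) {
        const batch = items.slice(i, i + batchSize);
        results.push(...await Promise.all(batch.map(handler)));
        if (i + batchSize < items.length) {
            await sleep(delayMs); // pause before the next batch
        }
    }
    return results;
}

// e.g. processWithDelay(domains, checkDomain, 25, 1000)
```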

Conclusion

For handling thousands of requests, Node.js provides an efficient model due to its non-blocking nature. If using Laravel, consider leveraging its queue system to handle tasks asynchronously and distribute the load. Always monitor and adjust the concurrency and rate limits based on the capabilities of your server and the specifics of the APIs or services you are interacting with.

martinbean:

I need to build a service where the user can enter a list of domain names to monitor.

@jack27696 There are plenty of services that let you do this already.
