
cURL error 60: SSL certificate problem when scraping google places with Goutte

The code below works perfectly for other websites, but when I try to scrape data from Google, I invariably get

cURL error 60: SSL certificate problem: unable to get local issuer certificate (see http://curl.haxx.se/libcurl/c/libcurl-errors.html)

I tried adding cacert.pem to php.ini, as someone on Laracasts recommended in another thread, but that didn't help either. Can this be done, or is it Google's way of telling me that I shouldn't try to scrape its data?

<?php

namespace App\Http\Controllers;

use Goutte\Client;
use GuzzleHttp\Client as GuzzleClient;

class WebScraperController extends Controller {
  public function index() {
    $client = new Client();

    // Raise the cURL timeout so slow responses don't abort the request.
    $guzzleClient = new GuzzleClient([
      'curl' => [
        CURLOPT_TIMEOUT => 60,
      ],
    ]);

    $client->setClient($guzzleClient);

    $urls = ["https://www.google.nl/search?q=restaurants+amsterdam&ie=utf-8&oe=utf-8&client=firefox-b-ab&gfe_rd=cr&ei=4praV6PwO6XA8gfj0rOwDA#q=restaurants+amsterdam&rflfq=1&rlha=0&rllag=52377091,4881360,1171&tbm=lcl&tbs=lf_msr:-1,lf:1,lf_ui:9"];

    foreach ($urls as $url) {
      $crawler = $client->request('GET', $url);

      // Print the text of every element matching Google's result selector.
      $crawler->filter('._rl')->each(function ($node) {
        print $node->text() . "\n";
      });
    }
  }
}
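
For context: cURL error 60 generally means cURL cannot locate a CA bundle to verify the server's certificate, so the request fails before Google ever sees it. Besides setting `curl.cainfo` in php.ini, Guzzle also accepts a `verify` request option that points at a CA bundle directly. A minimal sketch, where the bundle path is an example placeholder to adjust for your system:

```php
<?php
// Point Guzzle at an explicit CA bundle via the documented 'verify'
// request option (equivalent to curl.cainfo in php.ini, but per-client).
// '/path/to/cacert.pem' is a placeholder; use the actual absolute path
// to the cacert.pem you downloaded from the curl site.
$guzzleClient = new \GuzzleHttp\Client([
    'verify' => '/path/to/cacert.pem',
    'curl' => [
        CURLOPT_TIMEOUT => 60,
    ],
]);
```

If php.ini is the preferred route instead, the `curl.cainfo` path must be absolute and PHP must be restarted for the change to take effect, which is a common reason the php.ini approach appears not to work.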
