Is there a way in PHP to make an HTTP call without waiting for the response? I don't care about the response; I just want to do something like file_get_contents(), but without waiting for the request to finish before the rest of my code runs. This would be really useful for firing off a kind of "event" in my application, or for kicking off a long-running process.

Any ideas?
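
A minimal sketch of the kind of "fire and forget" call meant here, using a raw socket with a placeholder host and path; the request is written and the socket is closed without reading the response:

<?php
// Fire-and-forget GET: write the request, then close without reading the reply.
// 'example.com' and '/long-task.php' are placeholders.
$fp = fsockopen('example.com', 80, $errno, $errstr, 1); // 1-second connect timeout
if ($fp) {
    fwrite($fp, "GET /long-task.php HTTP/1.1\r\nHost: example.com\r\nConnection: Close\r\n\r\n");
    fclose($fp); // don't wait for the response
}
// ...the rest of the script continues immediately...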


Current answer

You can use this library: https://github.com/stil/curl-easy

It's really simple:

<?php
$request = new cURL\Request('http://yahoo.com/');
$request->getOptions()->set(CURLOPT_RETURNTRANSFER, true);

// Specify function to be called when your request is complete
$request->addListener('complete', function (cURL\Event $event) {
    $response = $event->response;
    $httpCode = $response->getInfo(CURLINFO_HTTP_CODE);
    $html = $response->getContent();
    echo "\nDone.\n";
});

// Loop below will run as long as request is processed
$timeStart = microtime(true);
while ($request->socketPerform()) {
    printf("Running time: %dms    \r", (microtime(true) - $timeStart)*1000);
    // Here you can do anything else, while your request is in progress
}

When run, the example above prints a simple live clock to the console, showing how long the request has been running.


Other answers

Symfony HttpClient is asynchronous: https://symfony.com/doc/current/components/http_client.html

For example, you can do:

use Symfony\Component\HttpClient\HttpClient;

$client = HttpClient::create();
$response1 = $client->request('GET', 'https://website1');
$response2 = $client->request('GET', 'https://website1');
$response3 = $client->request('GET', 'https://website1');
//these 3 calls will return immediately
//but the requests will fire to the website1 webserver

$response1->getContent(); //this will block until content is fetched
$response2->getContent(); //same 
$response3->getContent(); //same

Let me show you my approach.

It requires Node.js to be installed on the server.

(My server sends 1000 HTTPS requests in only 2 seconds.)

url.php:

<?php
$urls = array_fill(0, 100, 'http://google.com/blank.html');

function execinbackground($cmd) { 
    if (substr(php_uname(), 0, 7) == "Windows"){ 
        pclose(popen("start /B ". $cmd, "r"));  
    } 
    else { 
        exec($cmd . " > /dev/null &");   
    } 
} 
fwrite(fopen("urls.txt", "w"), implode("\n", $urls)); // one URL per line for the Node script
execinbackground("nodejs urlscript.js urls.txt");
// { do your work while get requests being executed.. }
?>

urlscript.js:

var https = require('https');
var url = require('url');
var http = require('http');
var fs = require('fs');
var dosya = process.argv[2]; // path of the file containing the URLs (first CLI argument)
var logdosya = 'log.txt';    // log file written on exit
var count=0;
http.globalAgent.maxSockets = 300;
https.globalAgent.maxSockets = 300;

setTimeout(timeout,100000); // maximum execution time (in ms)

function trim(string) {
    return string.replace(/^\s*|\s*$/g, '')
}

fs.readFile(process.argv[2], 'utf8', function (err, data) {
    if (err) {
        throw err;
    }
    parcala(data);
});

function parcala(data) { // split the file contents into URLs and fire a request for each
    var data = data.split("\n");
    count=''+data.length+'-'+data[1];
    data.forEach(function (d) {
        req(trim(d));
    });
    /*
    fs.unlink(dosya, function d() {
        console.log('<%s> file deleted', dosya);
    });
    */
}


function req(link) {
    var linkinfo = url.parse(link);
    if (linkinfo.protocol == 'https:') {
        var options = {
        host: linkinfo.host,
        port: 443,
        path: linkinfo.path,
        method: 'GET'
    };
https.get(options, function(res) {res.on('data', function(d) {});}).on('error', function(e) {console.error(e);});
    } else {
    var options = {
        host: linkinfo.host,
        port: 80,
        path: linkinfo.path,
        method: 'GET'
    };        
http.get(options, function(res) {res.on('data', function(d) {});}).on('error', function(e) {console.error(e);});
    }
}


process.on('exit', onExit);

function onExit() {
    log();
}

function timeout()
{
console.log("i am too far gone");process.exit();
}

function log() 
{
    var fd = fs.openSync(logdosya, 'a+');
    fs.writeSync(fd, dosya + '-'+count+'\n');
    fs.closeSync(fd);
}

The swoole extension (https://github.com/matyhtf/swoole), an asynchronous, concurrent networking framework for PHP.

$client = new swoole_client(SWOOLE_SOCK_TCP, SWOOLE_SOCK_ASYNC);

$client->on("connect", function($cli) {
    $cli->send("hello world\n");
});

$client->on("receive", function($cli, $data){
    echo "Receive: $data\n";
});

$client->on("error", function($cli){
    echo "connect fail\n";
});

$client->on("close", function($cli){
    echo "close\n";
});

$client->connect('127.0.0.1', 9501, 0.5);

Use cURL with a low CURLOPT_TIMEOUT_MS to effectively abort the request, and set ignore_user_abort(true) in the target script so it keeps processing after the connection is closed.

With this approach there is no need to implement connection handling via headers and output buffering, which depends too much on the operating system, the browser, and the PHP version.

Main process

function async_curl($background_process=''){

    //-------------get curl contents----------------

    $ch = curl_init($background_process);
    curl_setopt_array($ch, array(
        CURLOPT_RETURNTRANSFER => true,
        CURLOPT_NOSIGNAL => 1, // required for CURLOPT_TIMEOUT_MS values below 1000 ms to work (see NB below)
        CURLOPT_TIMEOUT_MS => 50, // maximum number of milliseconds cURL functions are allowed to run
        CURLOPT_VERBOSE => 1,
        CURLOPT_HEADER => 1 // include response headers so they can be parsed below if needed
    ));
    $out = curl_exec($ch);

    //-------------parse curl contents----------------

    //$header_size = curl_getinfo($ch, CURLINFO_HEADER_SIZE);
    //$header = substr($out, 0, $header_size);
    //$body = substr($out, $header_size);

    curl_close($ch);

    return true;
}

async_curl('http://example.com/background_process_1.php');

Background process

ignore_user_abort(true);

//do something...
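
Expanding on the stub above, a minimal sketch of what such a background script (the background_process_1.php referenced earlier) might contain, with a placeholder sleep() standing in for real work; ignore_user_abort() keeps it alive after the caller's 50 ms timeout drops the connection, and set_time_limit(0) removes PHP's execution time limit:

<?php
// background_process_1.php (sketch)
ignore_user_abort(true); // keep running after the caller aborts the connection
set_time_limit(0);       // no execution time limit for the long-running work

// ...do the actual long-running work here...
sleep(30); // placeholder for real work

// Optional: log completion so you can verify the script actually finished.
file_put_contents(__DIR__ . '/background.log', date('c') . " done\n", FILE_APPEND);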

NB

If you want cURL to time out in less than one second, you can use CURLOPT_TIMEOUT_MS, although there is a bug/"feature" on "Unix-like systems" that causes libcurl to time out immediately if the value is < 1000 ms, with the error "cURL Error (28): timeout was reached". The explanation for this behavior is: [...] The solution is to disable signals using CURLOPT_NOSIGNAL.

Resources

Curl timeout less than 1000ms always fails?
http://www.php.net/manual/en/function.curl-setopt.php#104597
http://php.net/manual/en/features.connection-handling.php

You can use exec() to invoke something that can perform HTTP requests, such as wget, but you must direct all output of the program to somewhere like a file or /dev/null; otherwise the PHP process will sit and wait for that output.

If you want to separate the process from the Apache thread entirely, try something like this (I'm not sure about the syntax, but you get the idea):

exec('bash -c "wget -O (url goes here) > /dev/null 2>&1 &"');

This isn't a pretty solution, though; you'll probably want something like a cron job calling a heartbeat script that polls an actual database event queue and does the real asynchronous work.
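
A rough sketch of that heartbeat idea, assuming a hypothetical events table with id, payload and processed_at columns (the table, columns and connection details are made up for illustration); cron runs the script every minute and it drains whatever is queued:

<?php
// heartbeat.php (run from cron, e.g. every minute: * * * * * php /path/to/heartbeat.php)
// Hypothetical schema: events(id, payload, processed_at NULL until handled)
$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');

// Grab a small batch of unprocessed events.
$rows = $pdo->query(
    'SELECT id, payload FROM events WHERE processed_at IS NULL ORDER BY id LIMIT 50'
)->fetchAll(PDO::FETCH_ASSOC);

$mark = $pdo->prepare('UPDATE events SET processed_at = NOW() WHERE id = ?');

foreach ($rows as $row) {
    // ...handle $row['payload'] here (send mail, call an API, etc.)...
    $mark->execute([$row['id']]); // mark the event as processed
}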