Is there a way in PHP to make an HTTP call and not wait for the response? I don't care about the response; I just want to do something like file_get_contents(), but without waiting for the request to finish before executing the rest of my code. This would be very useful for triggering "events" of a sort in my application, or kicking off long-running processes.
Any good ideas?
Current answer
/**
 * Asynchronously execute/include a PHP file. Does not record the output of the file anywhere.
 *
 * @param string $filename file to execute, relative to calling script
 * @param string $options (optional) arguments to pass to file via the command line
 */
function asyncInclude($filename, $options = '') {
    exec("/path/to/php -f {$filename} {$options} >> /dev/null &");
}
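A minimal usage sketch (the worker script name and its CLI argument are made up for illustration; adjust /path/to/php inside the helper to point at your PHP binary):

<?php
// Hypothetical example: kick off a long-running worker without blocking this request.
$task = escapeshellarg('rebuild-cache');        // illustrative argument
asyncInclude('worker.php', "--task={$task}");   // worker.php is a made-up script name

// Execution continues immediately; the worker runs in the background.
echo "Task queued\n";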
Other answers
This is my own PHP function for POSTing to a specific URL of any page. Example usage of my function:
<?php
parse_str("email=myemail@ehehehahaha.com&subject=this+is+just+a+test", $parsed);
$_POST['email'] = $parsed['email'];
$_POST['subject'] = $parsed['subject'];
echo HTTP_Post("http://example.com/mail.php", $_POST);
exit;
?>
<?php
/*********HTTP POST using FSOCKOPEN **************/
// by ArbZ
function HTTP_Post($URL, $data, $referrer = "") {
    // parsing the given URL
    $URL_Info = parse_url($URL);

    // Building referrer
    if ($referrer == "") // if not given use this script as referrer
        $referrer = $_SERVER["SCRIPT_URI"];

    // making string from $data
    $values = array();
    foreach ($data as $key => $value)
        $values[] = "$key=" . urlencode($value);
    $data_string = implode("&", $values);

    // Find out which port is needed - if not given use standard (=80)
    if (!isset($URL_Info["port"]))
        $URL_Info["port"] = 80;

    // building POST-request: HTTP headers
    $request  = "POST " . $URL_Info["path"] . " HTTP/1.1\r\n";
    $request .= "Host: " . $URL_Info["host"] . "\r\n";
    $request .= "Referer: $referrer\r\n";
    $request .= "Content-Type: application/x-www-form-urlencoded\r\n";
    $request .= "Content-Length: " . strlen($data_string) . "\r\n";
    $request .= "Connection: close\r\n";
    $request .= "\r\n";
    $request .= $data_string . "\r\n";

    $fp = fsockopen($URL_Info["host"], $URL_Info["port"]);
    fputs($fp, $request);
    $result = "";
    while (!feof($fp)) {
        $result .= fgets($fp, 128);
    }
    fclose($fp);

    // STORE THE FETCHED CONTENTS to a VARIABLE, because it's way better and faster...
    $txt = getTextBetweenTags($result, "span"); // text of the first <span> in the response
    $result = explode("&", $result);
    return $result[1];
}

// Extract the text between the first pair of the given tags
function getTextBetweenTags($string, $tagname) {
    $pattern = "/<$tagname ?.*>(.*)<\/$tagname>/";
    preg_match($pattern, $string, $matches);
    return $matches[1];
}
The answer I previously accepted doesn't work. It still waits for the response. This one does work, taken from How do I make an asynchronous GET request in PHP?
function post_without_wait($url, $params)
{
    $post_params = array();
    foreach ($params as $key => &$val) {
        if (is_array($val)) $val = implode(',', $val);
        $post_params[] = $key . '=' . urlencode($val);
    }
    $post_string = implode('&', $post_params);

    $parts = parse_url($url);

    $fp = fsockopen($parts['host'],
        isset($parts['port']) ? $parts['port'] : 80,
        $errno, $errstr, 30);

    $out  = "POST " . $parts['path'] . " HTTP/1.1\r\n";
    $out .= "Host: " . $parts['host'] . "\r\n";
    $out .= "Content-Type: application/x-www-form-urlencoded\r\n";
    $out .= "Content-Length: " . strlen($post_string) . "\r\n";
    $out .= "Connection: Close\r\n\r\n";
    if (isset($post_string)) $out .= $post_string;

    fwrite($fp, $out);
    fclose($fp);
}
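For illustration, a hypothetical call (the URL and fields are made up); the function returns as soon as the request has been written to the socket, without reading any response:

post_without_wait('http://example.com/longtask.php', array(
    'task'  => 'send_newsletter',  // made-up parameters
    'batch' => 42,
));
echo "Request dispatched, moving on...\n";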
If you control the target you want to call asynchronously (for example, your own "longtask.php"), you can close the connection from that end, and the two scripts will run in parallel. It works like this:

quick.php opens longtask.php via cURL (no magic here)
longtask.php closes the connection and keeps running (the magic!)
cURL returns to quick.php as soon as the connection is closed
Both tasks carry on in parallel

I have tried this, and it works well. But quick.php won't know anything about how longtask.php is doing unless you create some means of communication between the processes.

Try this code in longtask.php before you do anything else. It closes the connection but keeps on running (and suppresses any output):
while (ob_get_level()) ob_end_clean();   // discard any open output buffers
header('Connection: close');
ignore_user_abort();                     // keep running after the client disconnects
ob_start();
echo('Connection Closed');
$size = ob_get_length();
header("Content-Length: $size");         // tell the client exactly how much to read
ob_end_flush();
flush();                                 // push everything out and release the client
Code copied from the user-contributed notes in the PHP manual and improved somewhat.
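For completeness, a minimal sketch of the calling side (the "quick.php" role); the URL is hypothetical. Because longtask.php sends a Content-Length header and closes the connection with the snippet above, curl_exec() returns almost immediately while longtask.php keeps running:

<?php
// quick.php (illustrative): trigger the long-running script and carry on.
$ch = curl_init('http://example.com/longtask.php'); // hypothetical URL
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);     // capture the short "Connection Closed" body
curl_exec($ch);                                     // returns once longtask.php closes the connection
curl_close($ch);

echo "longtask.php is now running on its own\n";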
The swoole extension: https://github.com/matyhtf/swoole. An asynchronous, parallel networking framework for PHP.
$client = new swoole_client(SWOOLE_SOCK_TCP, SWOOLE_SOCK_ASYNC);

$client->on("connect", function($cli) {
    $cli->send("hello world\n");
});

$client->on("receive", function($cli, $data) {
    echo "Receive: $data\n";
});

$client->on("error", function($cli) {
    echo "connect fail\n";
});

$client->on("close", function($cli) {
    echo "close\n";
});

$client->connect('127.0.0.1', 9501, 0.5);
I found this package to be very useful and very simple: https://github.com/amphp/parallel-functions
<?php

use function Amp\ParallelFunctions\parallelMap;
use function Amp\Promise\wait;

$responses = wait(parallelMap([
    'https://google.com/',
    'https://github.com/',
    'https://stackoverflow.com/',
], function ($url) {
    return file_get_contents($url);
}));
It will load all three URLs in parallel. You can also use class instance methods in the closure.
For example, I use a Laravel extension based on this package: https://github.com/spatie/laravel-collection-macros#parallelmap
Here is my code:
/**
 * Get domains with all needed data
 */
protected function getDomainsWithdata(): Collection
{
    return $this->opensrs->getDomains()->parallelMap(function ($domain) {
        $contact = $this->opensrs->getDomainContact($domain);
        $contact['domain'] = $domain;
        return $contact;
    }, 10);
}
It loads all the needed data in 10 parallel threads, and instead of the 50 seconds it took without async, it finishes in 8 seconds.