You can use "php://input" to accept and parse "PUT", "DELETE", etc. requests.
<?php
// Example to parse "PUT" requests
parse_str(file_get_contents('php://input'), $_PUT);
// The result
print_r($_PUT);
?>
(very useful for RESTful APIs)
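In practice you may want to branch on the request method first; a minimal sketch (the $params variable name is just an example):
<?php
// PUT and DELETE bodies are only available through php://input
if (in_array($_SERVER['REQUEST_METHOD'], array('PUT', 'DELETE'))) {
    parse_str(file_get_contents('php://input'), $params);
    print_r($params);
}
?>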
PHP comes with many built-in wrappers for various URL-style protocols for use with the filesystem functions such as fopen(), copy(), file_exists() and filesize(). In addition to these built-in wrappers, you can also register your own with the stream_wrapper_register() function.
Note: The URL syntax used to describe a wrapper only supports the
scheme://...
form. The scheme:/ and scheme: forms are not supported.
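As a rough illustration of registering a wrapper of your own, here is a minimal sketch; the "upper" scheme and the UpperWrapper class are invented for this example and simply upper-case whatever text follows the scheme:
<?php
class UpperWrapper {
    public $context;
    private $data = '';
    private $pos = 0;

    public function stream_open($path, $mode, $options, &$opened_path) {
        // everything after "upper://" becomes the stream's content
        $this->data = strtoupper(substr($path, strlen('upper://')));
        return true;
    }
    public function stream_read($count) {
        $chunk = substr($this->data, $this->pos, $count);
        $this->pos += strlen($chunk);
        return $chunk;
    }
    public function stream_eof() {
        return $this->pos >= strlen($this->data);
    }
    public function stream_stat() {
        return array();
    }
}

stream_wrapper_register('upper', 'UpperWrapper');
$fp = fopen('upper://hello world', 'r');
echo fread($fp, 1024); // HELLO WORLD
fclose($fp);
?>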
Example of how to use php://input to get raw POST data:
//read the raw data in
$roughHTTPPOST = file_get_contents("php://input");
//parse it into variables (current PHP versions require the result array as the second argument)
parse_str($roughHTTPPOST, $postVars);
Note that readfile("php://input") will echo the raw POST data and return its length.
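A trivial sketch of that behaviour:
<?php
// echoes the raw request body and returns the number of bytes that were output
$length = readfile('php://input');
?>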
The constants:
* STDIN
* STDOUT
* STDERR
were introduced in PHP 4.3.0 and are synonymous with the resources returned by fopen('php://stdin'), fopen('php://stdout') and fopen('php://stderr').
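For instance, in a CLI script these two calls are interchangeable (a small sketch):
<?php
// both write the same message to the standard error stream of a CLI script
fwrite(STDERR, "something went wrong\n");
fwrite(fopen('php://stderr', 'w'), "something went wrong\n");
?>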
To create a raw TCP listener system I use the following:
xinetd daemon with a config like:
service test
{
disable = no
type = UNLISTED
socket_type = stream
protocol = tcp
bind = 127.0.0.1
port = 12345
wait = no
user = apache
group = apache
instances = 10
server = /usr/local/bin/php
server_args = -n [your php file here]
only_from = 127.0.0.1 #gotta love the security#
log_type = FILE /var/log/phperrors.log
log_on_success += DURATION
}
Now use fgets(STDIN) to read the input. It creates connections pretty quickly and works like a charm. Writing can be done using STDOUT, or just echo. Be aware that you're completely bypassing the webserver, so certain variables will not be available.
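For reference, a minimal sketch of the kind of script such an xinetd entry could point at (the line protocol here is invented purely for illustration):
<?php
// talk to the raw TCP client: read lines from STDIN, answer on STDOUT
while (($line = fgets(STDIN)) !== false) {
    $line = rtrim($line, "\r\n");
    if ($line === 'quit') {
        break;                       // client asked to end the session
    }
    fwrite(STDOUT, "echo: $line\n"); // goes straight back over the socket
}
?>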
In trying to do AJAX with PHP and JavaScript, I came upon an issue where the POST argument from the following JavaScript could not be read in PHP 5 using $_REQUEST or $_POST. I finally figured out how to read the raw data using the php://input stream.
Javascript code:
=============
//create request instance
xhttp = new XMLHttpRequest();
// set the event handler
xhttp.onreadystatechange = serviceReturn;
// prep the call, http method=POST, true=asynchronous call
var Args = 'number='+NbrValue;
xhttp.open("POST", "http://<?php echo $_SERVER['SERVER_NAME'] ?>/webservices/ws_service.php", true);
// send the call with args
xhttp.send(Args);
PHP Code:
//read the raw data in
$roughHTTPPOST = file_get_contents("php://input");
//parse it into variables (current PHP versions require the result array as the second argument)
parse_str($roughHTTPPOST, $postVars);
Here is a snippet to read compressed raw post data without enabling global variables.
I needed it to read xml posted data submitted by ocs agent. The data was sent as Content-Type: application/x-compressed (zlib compressed data).
It seems related to an old bug which still appears to be unresolved:
https://bugs.php.net/bug.php?id=49411
The important part is the default window set to 15 instead of -15.
Code snippet
<?php
$data = '';
$fh = fopen('php://input', 'rb');
// the default window of 15 (zlib header) is what matters here, not the often-suggested -15 (raw deflate)
stream_filter_append($fh, 'zlib.inflate', STREAM_FILTER_READ, array('window' => 15));
while (!feof($fh)) {
    $data .= fread($fh, 8192);
}
fclose($fh);
?>
The stream php://temp/maxmemory:$limit stores the data in memory until the limit is reached; then it writes the whole content to a temporary file and frees the memory. I didn't find a way to get at least some of the data back into memory.
Even though their names will be the same, you can have more than one //memory or //temp stream open concurrently; each time you fopen() such a stream, a NEW stream will be opened independently of the others.
This is hinted at by the fact you don't add any unique identifier to the path when creating such streams, but isn't said explicitly.
<?php
$hello = fopen('php://memory', 'r+'); // $hello, $php, $world are all different streams.
$php = fopen('php://memory', 'r+');
$world = fopen('php://memory', 'r+'); // They're not the same stream opened three times.
fputs($hello, "Hello ");
fputs($php, "PHP ");
rewind($php);
fputs($world, "World!");
rewind($hello);
rewind($world);
echo '[', stream_get_contents($hello), '][', stream_get_contents($php), '][', stream_get_contents($world), ']';
// If they were the same stream the output would be "[World!][World!][World!]".
?>
You can decompress a gzipped input stream by combining wrappers, e.g.:
$x = file_get_contents("compress.zlib://php://input");
I used this method to decompress a gzip stream that was pushed to my webserver.
For reading an XML stream, this will work just fine:
<?php
$arq = file_get_contents('php://input');
?>
Then you can parse the XML like this:
<?php
$xml = xml_parser_create();
xml_parse_into_struct($xml, $arq, $vs);
xml_parser_free($xml);
$data = "";
foreach ($vs as $v) {
    if ($v['level'] == 3 && $v['type'] == 'complete')
        $data .= "\n".$v['tag']." -> ".$v['value'];
}
echo $data;
?>
PS.: This is particularly useful for receiving mobile originated (MO) SMS messages from cellular phone companies.
Opening php://output in append mode raises an error; open it in write mode instead:
$fp=fopen("php://output","w");
fwrite($fp,"Hello, world !<BR>\n");
fclose($fp);
When writing to the error stream, the error_log() function is a handy shorthand for writing to php://stderr. It also allows writing to the web server's log when running under a web server such as Apache.
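For example, these are roughly equivalent ways of getting a message onto the error stream (a sketch; where the message ultimately lands depends on the error_log ini setting and the SAPI):
<?php
error_log("something noteworthy happened");                           // shorthand, honours the error_log setting
file_put_contents('php://stderr', "something noteworthy happened\n"); // explicit write to stderr
?>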
Be forewarned:
the file:// protocol is used by file_get_contents() as the default for any unrecognized protocol. Thus:
aldfjadlfadfladfl://whatever
will deliver the same as
file://whatever
If you want to filter incoming data through php://input use this:
file_get_contents("php://filter/read=string.strip_tags/resource=php://input");
I couldn't find any documentation to explain how to do this. All the examples I came across suggested that a full and actual URL had to be used (which didn't work for me).
This seems to work though.
If my understanding of the implementing code is correct, every time you open a php://memory stream, you get new storage allocated. That is to say, php://memory isn't a shared bank of memory.
[ Editor's Note: There is a way to know. All response headers (from both the final responding server and intermediate redirecters) can be found in $http_response_header or stream_get_meta_data() as described above. ]
If you open an HTTP url and the server issues a Location style redirect, the redirected contents will be read but you can't find out that this has happened.
So if you then parse the returned html and try and rationalise relative URLs you could get it wrong.
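Building on the editor's note, a sketch of detecting such a redirect (example.com is just a placeholder URL):
<?php
$html = file_get_contents('http://example.com/maybe-redirects');
// $http_response_header contains every header line received, including
// those from intermediate redirect responses
if (isset($http_response_header)) {
    foreach ($http_response_header as $header) {
        if (stripos($header, 'Location:') === 0) {
            echo "Request was redirected: $header\n";
        }
    }
}
?>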
If you're looking for a unix based smb wrapper there isn't one built in, but I've had luck with http://www.zevils.com/cgi-bin/viewcvs.cgi/libsmbclient-php/ (tarball link at the end).
Not only are STDIN, STDOUT, and STDERR only available to CLI programs, they are also unavailable when the script itself is read from STDIN. That can confuse you if you try to type in a simple test program.
I find using file_get_contents() with php://input very handy and efficient. Here is the code:
$request = file_get_contents("php://input");
There is no need to open the stream with an explicit "r" mode; file_get_contents() automatically opens it for reading. The $request string can then be passed to an XML parser as data.
Be aware of code injection, folks - like anything else you take from the user, SANITISE IT FIRST. This cannot be stressed enough: if I had a dollar for each time I saw code where form input was taken and used directly (by myself as well, I've been stupid too), I'd probably own PHP. While using data from a form in a URL wrapper is asking for trouble, you can greatly minimise that trouble by making sure your inputs are sane and don't provide an opening for the LulzSec of the world to cause havoc.
To use https:// on Windows, enable this extension in php.ini:
extension=php_openssl.dll
php://stdin supports the fseek() and fstat() calls, while php://input does not.
Follow-up:
I found that if I added this line to the AJAX call (before xhttp.send()), the values would show up in $_POST:
xhttp.setRequestHeader('Content-Type',
'application/x-www-form-urlencoded');
The php://fd/ wrapper is only supported in the CLI binary.
A useful way to handle large file uploads is to do something like:
copy("php://input", $tmpfile);
as this avoids using lots of memory just to buffer the file content.
The correct MIME type for this would be "application/octet-stream"; however, if you set this or any other recognised MIME type other than "multipart/form-data" on your POST, $HTTP_RAW_POST_DATA is populated and the memory is consumed anyway.
Setting the MIME type to "multipart/form-data" raises "PHP Warning: Missing boundary in multipart/form-data POST data in Unknown on line 0", but it seems to work without a problem.
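A slightly fuller sketch of the same idea (the temporary-file handling below is just one possible choice):
<?php
// stream the raw request body straight to disk without buffering it all in memory
$tmpfile = tempnam(sys_get_temp_dir(), 'upload_');
copy('php://input', $tmpfile);
echo filesize($tmpfile) . " bytes written to $tmpfile";
?>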
For php://filter the /resource=foo part must come last. And foo needs no escaping at all.
php://filter/resource=foo/read=somefilter would try to open a file 'foo/read=somefilter' while php://filter/read=somefilter/resource=foo will open file 'foo' with the somefilter filter applied.
The use of php://temp/maxmemory as a stream counts towards the memory usage of the script; you are not specifying a new memory pool by using this type of stream.
As noted in the documentation, however, this stream type will start writing to a file once the specified maxmemory limit is exceeded. That file buffer is NOT counted against the memory limit.
This is handy if you want your script to have a reasonably small memory limit (eg 32MB) but still be able to handle a huge amount of data in a stream (eg 256MB).
This only works if you use stream functions like fputs(); if you use $buffer .= 'string'; or $buffer = $buffer . 'string'; you're pulling your stream data back into PHP and this will hit the limiter.
As a practical example:
<?php
// 0.5MB memory limit for the script (use 512K; fractional shorthand like '0.5M' is not reliably parsed)
ini_set('memory_limit', '512K');
// 1MB in-memory limit for the stream
$buffer = fopen('php://temp/maxmemory:1048576', 'r+');
$x = 0;
// Attempt to write 1MB to the stream; it stays in memory and trips the script's memory limit
while ($x < 1*1024*1024) {
    fputs($buffer, 'a');
    $x++;
}
echo "This will never be displayed";
?>
However, change fopen to use php://temp/maxmemory:1 (one byte, rather than one megabyte) and it will begin writing to the unlimited file stream immediately, avoiding memory limit errors.
<?php
// Emulate $HTTP_RAW_POST_DATA when necessary. In PHP 5.6, an
// always_populate_raw_post_data value of -1 (normally set in php.ini)
// stops PHP from populating the variable and silences the deprecation
// warning, so populate it manually:
ini_set('always_populate_raw_post_data', -1);
$HTTP_RAW_POST_DATA = file_get_contents('php://input');
echo $HTTP_RAW_POST_DATA;
?>
In PHP 5.4+ you can read multipart data via php://input if you set enable_post_data_reading to Off.
Of course, if you set it to Off, the $_POST and $_FILES superglobals won't be populated at all; it's entirely up to you to parse the data.
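A very rough sketch of what "parsing it yourself" can look like (this only splits the body into its parts; parsing each part's headers and filenames is left out, and the boundary handling is simplified):
<?php
// assumes enable_post_data_reading = Off, so $_POST/$_FILES are empty
$raw = file_get_contents('php://input');
$contentType = isset($_SERVER['CONTENT_TYPE']) ? $_SERVER['CONTENT_TYPE'] : '';
if (preg_match('/boundary=(.+)$/', $contentType, $m)) {
    // drop the preamble before the first boundary and the trailing "--" epilogue
    $parts = array_slice(explode('--' . trim($m[1], '"'), $raw), 1, -1);
    echo count($parts) . " multipart sections received";
}
?>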
Each stream pointer to php://memory and php://temp has its own memory allocation, so you can open many stream pointers to store your separated values.
<?php
$fp = fopen("php://temp", "r+");
$fp2 = fopen("php://temp", "r+");
fwrite($fp, "line1\n");
fwrite($fp2, "line4\n");
fwrite($fp, "line2\n");
fwrite($fp2, "line5\n");
fwrite($fp, "line3\n");
fwrite($fp2, "line6\n");
var_dump(memory_get_usage());
rewind($fp);
while (!feof($fp)) {
    var_dump(fread($fp, 1024));
}
fclose($fp);
var_dump(memory_get_usage());
rewind($fp2);
while (!feof($fp2)) {
    var_dump(fread($fp2, 1024));
}
fclose($fp2);
var_dump(memory_get_usage());
?>
Closing their stream handles will also free the allocated memory.
The php://memory stream type is MEMORY, while the php://temp stream type is a STDIO FILE*.
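A quick way to see what each wrapper reports about itself (a sketch; inspect the output rather than relying on exact values across PHP versions):
<?php
$mem  = fopen('php://memory', 'r+');
$temp = fopen('php://temp', 'r+');
var_dump(stream_get_meta_data($mem));  // wrapper and stream type of php://memory
var_dump(stream_get_meta_data($temp)); // wrapper and stream type of php://temp
?>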