PHRETS
Pagination isn't working. User or Server issue?
Hello,
I am trying to use pagination with my RETS request, following the example from the PHRETS: Logging video. I thought that setting the last argument of the Search function to true would perform a recursive search and therefore paginate, but I'm not sure whether that's actually the case.
How do I make sure my RETS request is set to use pagination? How do I know if the server is set up to use pagination?
My ultimate goal is to limit peak memory usage, and I was wondering whether pagination would be a solution. Is that true? If I use pagination, will the amount of memory used be less than without it?
I want to make sure my application won't crash if the request is abnormally large.
Code:
<?php
date_default_timezone_set('America/Toronto');
require_once("vendor/autoload.php");

$config = new \PHRETS\Configuration;
$config->setLoginUrl('rets_url')
    ->setUsername('username')
    ->setPassword('password')
    ->setOption('use_post_method', true)
    ->setHttpAuthenticationMethod('digest')
    ->setOption('disable_follow_location', false)
    ->setRetsVersion('1.7');

$rets = new \PHRETS\Session($config);

// If you're using Monolog already for logging, you can pass that logging instance to PHRETS
// for some additional insight into what PHRETS is doing.
$log = new \Monolog\Logger('PHRETS');
$log->pushHandler(new \Monolog\Handler\StreamHandler('php://stdout', \Monolog\Logger::DEBUG));
$log->pushHandler(new \Monolog\Handler\StreamHandler('./sample_log.log', \Monolog\Logger::DEBUG));
$rets->setLogger($log);

$connect = $rets->Login();

$query = "(Timestamp_sql=2017-11-26+)";
$property_class = array("ResidentialProperty");

foreach ($property_class as $pc) {
    // make the request and get the results (5th argument = true for a recursive/paginated search)
    $results = $rets->Search('Property', $pc, $query, ['Limit' => 50], true);

    // save the results in a local file
    file_put_contents('data/Property_' . $pc . '.csv', $results->toCSV());
}
?>
Output:
[2017-11-28 16:54:48] PHRETS.DEBUG: Loading Monolog\Logger logger [] []
[2017-11-28 16:54:48] PHRETS.DEBUG: Sending HTTP Request for rets_url (Login) {"auth":["username","password","digest"],"headers":{"User-Agent":"PHRETS/2.0","RETS-Version":"RETS/1.7","Accept-Encoding":"gzip","Accept":"*/*"},"curl":{"10031":"/tmp/phrets71b7W1"}} []
[2017-11-28 16:54:48] PHRETS.DEBUG: Using POST method per use_post_method option [] []
[2017-11-28 16:54:48] PHRETS.DEBUG: Response: HTTP 200 [] []
[2017-11-28 16:54:48] PHRETS.DEBUG: Sending HTTP Request for rets_url (Action) {"auth":["username","password","digest"],"headers":{"User-Agent":"PHRETS/2.0","RETS-Version":"RETS/1.7","Accept-Encoding":"gzip","Accept":"*/*"},"curl":{"10031":"/tmp/phretsS5gIOF"}} []
[2017-11-28 16:54:48] PHRETS.DEBUG: Using POST method per use_post_method option [] []
[2017-11-28 16:54:48] PHRETS.DEBUG: Response: HTTP 200 [] []
[2017-11-28 16:54:48] PHRETS.DEBUG: Sending HTTP Request for rets_url (Search) {"auth":["username","password","digest"],"headers":{"User-Agent":"PHRETS/2.0","RETS-Version":"RETS/1.7","Accept-Encoding":"gzip","Accept":"*/*"},"curl":{"10031":"/tmp/phretsl93IKj"},"query":{"SearchType":"Property","Class":"ResidentialProperty","Query":"(Timestamp_sql=2017-11-26+)","QueryType":"DMQL2","Count":1,"Format":"COMPACT-DECODED","Limit":50,"StandardNames":0}} []
[2017-11-28 16:54:48] PHRETS.DEBUG: Using POST method per use_post_method option [] []
[2017-11-28 16:54:50] PHRETS.DEBUG: Response: HTTP 200 [] []
[2017-11-28 16:54:50] PHRETS.DEBUG: 221 column headers/fields given [] []
[2017-11-28 16:54:50] PHRETS.DEBUG: 4031 total results found [] []
[2017-11-28 16:54:50] PHRETS.DEBUG: 50 results given [] []
In this case, it looks like the issue is with the server. When a Limit is requested and the server has more records to provide above that limit, it's supposed to return a MAXROWS element, which tells the client that there's more to get. Here, the server isn't returning that, so PHRETS believes it's done.
There are a few options for you I think:
- Do pagination yourself (don't set the 5th param; see the sketch after this list), or
- Implement a custom parser that tells PHRETS to behave differently.
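For the first option, here is a minimal sketch of doing the pagination yourself with the Limit and Offset search arguments. It assumes the same $rets session, resource, class and query used in the code earlier in this thread; the page size is a placeholder, and Offset support still depends on the server honoring it.

// Manual-pagination sketch (assumes the logged-in $rets \PHRETS\Session from the code above).
$limit  = 50;  // records per request
$offset = 0;   // where the next page should start

do {
    $results = $rets->Search('Property', 'ResidentialProperty', '(Timestamp_sql=2017-11-26+)', [
        'Limit'  => $limit,
        'Offset' => $offset,
    ]);

    foreach ($results as $record) {
        // Process each record here (write it out, insert into a database, etc.)
        // instead of accumulating everything in memory.
    }

    // Advance by however many records the server actually returned.
    $offset += $results->getReturnedResultsCount();
} while ($results->getReturnedResultsCount() > 0 && $offset < $results->getTotalResultsCount());

Because each page is processed and discarded before the next request, peak memory stays roughly proportional to the page size rather than to the full result set.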
For the custom parser, you'd basically need to (see the sketch after these steps):
- Make your own class which extends \PHRETS\Parsers\Search\RecursiveOneX
- In your class, declare a new continuePaginating() method that does something like return ($rs->getReturnedResultsCount() < $rs->getTotalResultsCount());
- Once you make your Session object, do: $session->setParser(\PHRETS\Strategies::PARSER_SEARCH_RECURSIVE, new YourCustomParser());
and then PHRETS should use your class for controlling automatic pagination instead of its default, standard behavior.
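Here is a minimal sketch of that custom parser approach, using the class and constant names from the steps above. The visibility and parameter list of continuePaginating() shown here are assumptions, so check the RecursiveOneX class in your installed PHRETS version before copying this.

// Sketch only: a parser that keeps paginating until the number of records
// received matches the total the server reported, instead of relying on
// the MAXROWS flag.
class YourCustomParser extends \PHRETS\Parsers\Search\RecursiveOneX
{
    // Assumed signature: adjust the parameters/visibility to match however
    // RecursiveOneX calls this method in your PHRETS version.
    protected function continuePaginating($rs)
    {
        // $rs is the Results object for the page just fetched.
        return ($rs->getReturnedResultsCount() < $rs->getTotalResultsCount());
    }
}

// After creating the Session, register the custom parser so PHRETS uses it
// when deciding whether to request another page.
$rets = new \PHRETS\Session($config);
$rets->setParser(\PHRETS\Strategies::PARSER_SEARCH_RECURSIVE, new YourCustomParser());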
To answer your question about memory, using automatic pagination does require more memory since PHRETS will collect all records before returning them to you. I have a proof-of-concept working which will allow the best of both worlds (PHRETS manages the retrieval of all records but memory usage is still very low) but it's not ready for release yet.
Any update to support pagination?
Maybe by adding a callback function to Search?
@budirec Pagination according to the RETS standard is already supported. Please see my earlier comment for details on why it isn't working with this particular server. RETS is a general standard that requires each vendor to do their own implementation, and optional features like pagination may not be implemented correctly (if at all).
I'm sorry, I was referring to your second comment about the proof of concept. My goal with pagination is to save memory by processing the results one page at a time, so we don't end up with a very big result set that has to be dealt with all at once. For now, I'll try the custom parser you mentioned.
Thanks
I used the code below for pagination: https://github.com/troydavisson/PHRETS/wiki/Connect,-download-listing-data-in-csv-format,-disconnect
But I am confused about how that pagination works. Can you please clarify the Limit and Offset parameters? And is this code actually doing pagination?
Please reply as soon as possible.
Hello, can you please send me your pagination code if you have it?
Sorry, I don't have it. Currently I limit the search per ZIP code with multiple filters, so it never returns crazy results.
Ok, no problem. Thanks.
Hi, I used the WP Property Importer plugin. I need pagination for the importer because importing all properties at once takes too much time and sometimes causes a timeout error. Can you please provide code for pagination for the importer plugin?