
I have a site that lets users search a list based on their own criteria. Instead of showing all the matching records at once, I'm placing navigation buttons and would like to show only five at a time. Should I cache the result set myself and transfer 5 at a time, or is there a function provided by MySQL or PHP for this? I've tested a persistent connection with the MySQL LIMIT statement, but it doesn't work. Any help would be appreciated.

PHP

6 Comment(s)

@SrWebDeveloper (Jan 08, 2010):

[QUOTE]I've tested a persistent connection with the MySQL LIMIT statement, but it doesn't work.[/QUOTE]

Please explain specifically what led you to that conclusion, ideally with the exact SQL query and PHP code, and why you mentioned persistent connections, which as far as I can see make no difference to your question as stated.

In general, all search queries should have practical limits on the number of rows returned. It's also true that a good search engine uses sensible forms and validation to ensure proper boolean logic and proper use of wildcards, applies a bad-word filter, matches on a minimum of 2-3 letters, and removes common words ("a", "the", etc.), all of which help reduce the rows returned and [B]optimize[/B] the query. So consider all of this in your overall design.

For user-initiated searches, I personally like this approach:

1) Limit results to 5, 10, 20 or 50 per page; make it user-selectable as a select menu or whatever in your form

2) Run one query returning only the total result count, with no limit imposed

3) Run another query returning the data, with the page limit imposed

4) Use Ajax to populate a div displaying the results formatted as you wish, add a "More..." button or pagination links, and load each successive page using the offset argument of MySQL's LIMIT clause
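Putting the four steps together, here is a rough sketch in the same old-style mysql_* API used elsewhere in this thread. The table name `articles`, column `title`, and the GET parameter names are my own placeholders, not anything PHP or MySQL dictates, so adapt them to your schema:

```php
<?php
// Step 1: per-page size and page number come from the form / Ajax request.
$perPage = isset($_GET['per_page']) ? (int) $_GET['per_page'] : 10;
$page    = isset($_GET['page']) ? max(1, (int) $_GET['page']) : 1;
$term    = mysql_real_escape_string($_GET['q']); // assumes an open connection

// Step 2: total matches, no LIMIT imposed.
$res   = mysql_query("SELECT COUNT(*) FROM articles WHERE title LIKE '%$term%'");
$total = (int) mysql_result($res, 0);

// Step 3: one page of data, using LIMIT offset, row_count.
$offset = ($page - 1) * $perPage;
$res = mysql_query("SELECT id, title FROM articles
                    WHERE title LIKE '%$term%'
                    LIMIT $offset, $perPage");

// Step 4: emit the rows; your Ajax handler drops this into the results div.
while ($row = mysql_fetch_assoc($res)) {
    echo $row['title'], "\n";
}
echo "Showing page $page of ", ceil($total / $perPage);
```

The Ajax side just re-requests this script with a new `page` value each time the user clicks "More..." or a pagination link.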

"Caching" is not a replacement for above, i.e. you never want to pre-load all matching data in a single result set with no limit imposed. Works great on a small site, but a site with lots of connections and large data set this will have a serious hit on server performance and users will notice. However, caching in the sense of storing "common" queries based on often used keywords is a sensible way of returning data you know doesn't change often. Caching is excellent for spider software that crawls your site and indexes the search database as well, but that's not the same as the search tool used by the client which I believe is the focus of this topic.

Unless you had another idea in mind as to caching?
@homer_j_simpson (author, Jan 08, 2010):

I'm planning to implement your method no. 4. I had tried mysql_pconnect with a LIMIT statement, expecting it would have some sort of caching scheme that knows which records have already been picked up and resumes where it left off the next time the query executes.

$a = mysql_pconnect("", "", "");

$b = mysql_select_db("[db]", $a);

$c = mysql_query("SELECT * FROM [table] WHERE [criteria] LIMIT [no]", $a); // was msql_query (typo), and WHERE/LIMIT need surrounding spaces
@SrWebDeveloper (Jan 08, 2010):

The pconnect call simply means a persistent connection for that user to the MySQL server, and has no direct bearing on data or result sets (no "caching").

When any query is made, a connection is established between the two. Persistence simply means the initial connection state is maintained, so for successive queries from the same user there is no need to log in again with the username and password. This means faster response due to less overhead.


Full details here (read the description part closely)
@homer_j_simpson (author, Jan 08, 2010):

[QUOTE]The pconnect call simply means a persistent connection for that user to the MySQL server, and has no direct bearing on data or result sets (no "caching").[/QUOTE]

Thanks for the explanation. Do you know a solution to my problem?
@SrWebDeveloper (Jan 08, 2010):

[B]How to constrain results:[/B]


The LIMIT clause can be used to constrain the number of rows returned by the SELECT statement. LIMIT takes one or two numeric arguments. With two arguments, the first argument specifies the offset of the first row to return, and the second specifies the maximum number of rows to return. The offset of the initial row is 0 (not 1):

SELECT * FROM tbl LIMIT 5,10; # Retrieve rows 6-15

With one argument, the value specifies the number of rows to return from the beginning of the result set:

SELECT * FROM tbl LIMIT 5; # Retrieve first 5 rows

In other words, LIMIT row_count is equivalent to LIMIT 0, row_count.
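For pagination, the offset arithmetic from those rules can be wrapped in a one-line helper. The function name is my own; the math is just the manual's rule that row offsets start at 0:

```php
<?php
// Offset of the first row on page $page (1-based page numbers),
// given $perPage rows per page. MySQL offsets are 0-based,
// hence the subtraction.
function pageOffset($page, $perPage) {
    return ($page - 1) * $perPage;
}

// Page 1, 5 per page -> LIMIT 0, 5  (rows 1-5)
// Page 3, 5 per page -> LIMIT 10, 5 (rows 11-15)
printf("SELECT * FROM tbl LIMIT %d, %d\n", pageOffset(3, 5), 5);
```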

[B]Caching:[/B]

You learned from this discussion that persistent connections have nothing to do with data caching. See my previous reply as to how caching can be implemented during indexing, but don't expect code; you need to research that on your own, and I don't think you even need to worry about it yet. Since you plan to use Ajax as I mentioned, I suggested one query to keep track of total results per search, then another query using LIMIT with an offset to display X rows per page up to the total.

If you run MySQL 5.1, SQL_CALC_FOUND_ROWS tells MySQL to calculate how many rows there would be in the result set, disregarding any LIMIT clause. The number of rows can then be retrieved with SELECT FOUND_ROWS(). Detailed help on this is found here.
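A sketch of how that replaces the separate COUNT(*) query; the table and column names are placeholders, and FOUND_ROWS() must be read on the same connection, immediately after the SQL_CALC_FOUND_ROWS query:

```php
<?php
// Build the page query; SQL_CALC_FOUND_ROWS makes MySQL also compute
// the un-LIMITed total as a side effect of this one round trip.
function pagedQuerySql($term, $offset, $perPage) {
    return "SELECT SQL_CALC_FOUND_ROWS id, title FROM articles "
         . "WHERE title LIKE '%$term%' LIMIT $offset, $perPage";
}

$sql = pagedQuerySql('news', 0, 5);
// mysql_query($sql);                          // fetch the page of rows
// $res   = mysql_query("SELECT FOUND_ROWS()"); // then read the total
// $total = (int) mysql_result($res, 0);
echo $sql, "\n";
```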

I've supplied plenty of links to help you achieve your goals and respond to all queries posted here. Enjoy!

-jim
@homer_j_simpson (author, Jan 08, 2010):

Thanks a bunch. I'll see how it goes.